Packages used: knitr, dplyr, ggplot2, broom, reshape2, janitor, plm, pwt9, quarto, renv, shiny, targets, testthat, tidyverse, tibble, usethis, rio, lubridate, purrr, Hmisc, plotly, hrbrthemes, xts, seasonal, tsbox, forecast, tseries, ggridges, shades, urca
colorize <- function(x, color) {
  if (knitr::is_latex_output()) {
    sprintf("\\textcolor{%s}{%s}", color, x)
  } else if (knitr::is_html_output()) {
    sprintf("<span style='color: %s;'>%s</span>", color, x)
  } else x
}
#webshot::install_phantomjs()
The dataset used for this project is Daily Delhi Climate, which consists of the following columns:
1. date: Date of format YYYY-MM-DD, starting at "2013-01-01" and ending at "2017-01-01".
2. meantemp: Mean temperature averaged out from multiple 3 hour intervals in a day.
3. humidity: Humidity value for the day (units are grams of water vapor per cubic meter volume of air).
4. wind_speed: Wind speed measured in kmph.
5. meanpressure: Mean pressure reading for the day (stated to be in atm, though the typical values near 1010 in the summaries below suggest hPa).
Q1. How can I find out whether analyzing meantemp individually (univariate) suffices, or whether including the other columns and performing a multivariate analysis would be worth the effort to uncover more meaningful patterns and forecasts?
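A rough first screen for Q1 is to look at contemporaneous and lagged correlations between meantemp and the other columns: if they are all weak, a univariate model loses little. The sketch below runs on synthetic stand-in data (the column names mirror the dataset, but the sinusoids and noise levels are my own assumptions), so only the recipe, not the numbers, carries over to df_train.

```r
# Quick screen: do the other columns co-move with meantemp enough to
# justify a multivariate model? Synthetic stand-in data below; with the
# real dataset, replace `df` by df_train.
set.seed(1)
n <- 365
day <- seq_len(n)
df <- data.frame(
  meantemp     = 25 + 10 * sin(2 * pi * day / 365) + rnorm(n),
  humidity     = 60 - 8  * sin(2 * pi * day / 365) + rnorm(n, sd = 5),
  wind_speed   = rnorm(n, mean = 7, sd = 2),
  meanpressure = rnorm(n, mean = 1010, sd = 4)
)

# Contemporaneous correlations with meantemp
cors <- sapply(df[c("humidity", "wind_speed", "meanpressure")],
               function(x) cor(df$meantemp, x))
round(cors, 2)

# Lagged cross-correlations: a large spike at a nonzero lag would
# suggest a column that leads meantemp and might improve forecasts.
cc <- ccf(df$humidity, df$meantemp, lag.max = 30, plot = FALSE)
max(abs(cc$acf))
```

In the synthetic data, humidity is built to anti-correlate with meantemp while wind speed is pure noise, so the correlation screen separates the two cases clearly.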
The goal of this project is to analyze and forecast the mean temperature in Delhi, recorded in the meantemp column. After importing the dataset, outliers are removed in the Preprocessing section. The meantemp column is then assigned to a time series object in the Construct Time Series section for further processing, analysis, and forecasting. After detecting seasonality using plots, the time series is seasonally adjusted using X-13ARIMA-SEATS. The remaining trend is then removed in the Detrend section.
Before forecasting the time series, I check its stationarity, since stationarity is an assumption of the ARIMA model. For this purpose, unit root tests are applied in the Stationarity section.
Finally, I use an ARIMA model to forecast the time series in the Forecast Time Series section.
df_train <- read_csv("data/DailyDelhiClimateTrain.csv")
## Rows: 1462 Columns: 5
## ── Column specification ────────────────────────────────────────────────────────
## Delimiter: ","
## dbl (4): meantemp, humidity, wind_speed, meanpressure
## date (1): date
##
## ℹ Use `spec()` to retrieve the full column specification for this data.
## ℹ Specify the column types or set `show_col_types = FALSE` to quiet this message.
df_test <- read_csv("data/DailyDelhiClimateTest.csv")
## Rows: 114 Columns: 5
## ── Column specification ────────────────────────────────────────────────────────
## Delimiter: ","
## dbl (4): meantemp, humidity, wind_speed, meanpressure
## date (1): date
##
## ℹ Use `spec()` to retrieve the full column specification for this data.
## ℹ Specify the column types or set `show_col_types = FALSE` to quiet this message.
summary(df_train)
## date meantemp humidity wind_speed
## Min. :2013-01-01 Min. : 6.00 Min. : 13.43 Min. : 0.000
## 1st Qu.:2014-01-01 1st Qu.:18.86 1st Qu.: 50.38 1st Qu.: 3.475
## Median :2015-01-01 Median :27.71 Median : 62.62 Median : 6.222
## Mean :2015-01-01 Mean :25.50 Mean : 60.77 Mean : 6.802
## 3rd Qu.:2016-01-01 3rd Qu.:31.31 3rd Qu.: 72.22 3rd Qu.: 9.238
## Max. :2017-01-01 Max. :38.71 Max. :100.00 Max. :42.220
## meanpressure
## Min. : -3.042
## 1st Qu.:1001.580
## Median :1008.563
## Mean :1011.105
## 3rd Qu.:1014.945
## Max. :7679.333
df_train |> describe()
## df_train
##
## 5 Variables 1462 Observations
## --------------------------------------------------------------------------------
## date
## n missing distinct Info Mean Gmd .05
## 1462 0 1462 1 2015-01-01 487.7 2013-03-15
## .10 .25 .50 .75 .90 .95
## 2013-05-27 2014-01-01 2015-01-01 2016-01-01 2016-08-07 2016-10-19
##
## lowest : 2013-01-01 2013-01-02 2013-01-03 2013-01-04 2013-01-05
## highest: 2016-12-28 2016-12-29 2016-12-30 2016-12-31 2017-01-01
## --------------------------------------------------------------------------------
## meantemp
## n missing distinct Info Mean Gmd .05 .10
## 1462 0 617 1 25.5 8.331 12.51 14.62
## .25 .50 .75 .90 .95
## 18.86 27.71 31.31 33.71 35.43
##
## lowest : 6.000000 7.000000 7.166667 7.400000 8.666667
## highest: 38.200000 38.272727 38.428571 38.500000 38.714286
## --------------------------------------------------------------------------------
## humidity
## n missing distinct Info Mean Gmd .05 .10
## 1462 0 897 1 60.77 18.96 29.00 36.21
## .25 .50 .75 .90 .95
## 50.38 62.62 72.22 81.85 86.99
##
## lowest : 13.42857 15.85714 18.46667 19.00000 19.50000
## highest: 96.12500 96.62500 96.85714 98.00000 100.00000
## --------------------------------------------------------------------------------
## wind_speed
## n missing distinct Info Mean Gmd .05 .10
## 1462 0 736 1 6.802 4.878 0.925 1.625
## .25 .50 .75 .90 .95
## 3.475 6.222 9.238 12.608 14.813
##
## lowest : 0.0000000 0.4625000 0.4750000 0.5285714 0.6166667
## highest: 27.7750000 30.6857143 33.3250000 34.4875000 42.2200000
## --------------------------------------------------------------------------------
## meanpressure
## n missing distinct Info Mean Gmd .05 .10
## 1462 0 626 1 1011 22.92 996.8 998.1
## .25 .50 .75 .90 .95
## 1001.6 1008.6 1014.9 1017.8 1019.3
##
## lowest : -3.041667 12.045455 310.437500 633.900000 938.066667
## highest: 1022.125000 1023.000000 1350.296296 1352.615385 7679.333333
##
## Value 0 300 600 900 1000 1400 7700
## Frequency 2 1 1 2 1453 2 1
## Proportion 0.001 0.001 0.001 0.001 0.994 0.001 0.001
##
## For the frequency table, variable is rounded to the nearest 100
## --------------------------------------------------------------------------------
Below we can see an interactive plot of the time series.
p <- df_train |>
  ggplot(aes(x=date, y=meantemp)) +
  geom_area(fill="#69b3a2", alpha=0.5) +
  geom_line(color="#69b3a2") +
  ylab("mean temperature (°C)") +
  theme_ipsum()
# Turn it interactive with ggplotly
p <- ggplotly(p)
#p
p
We can detect an outlier at the last observation (last row of the data frame): an abrupt drop in temperature. This would cause problems in the analysis that follows and in functions applied to the time series later. Therefore, I replace the last observation's value with the previous one.
Q2. Is imputing with the last observation the right approach here, or is it preferable to use the median of the last n rows, for instance? Maybe it won't matter, since when I aggregate the data (per week, month, etc.), I remove the last observation.
previous_value <- df_train$meantemp[df_train$date == as.Date('2016-12-31')]
df_train$meantemp[df_train$date == as.Date('2017-01-01')] <- previous_value
#df_train <- head(df_train, -1)
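Regarding Q2, the median-of-last-n alternative is easy to try and is more robust if the final values are themselves noisy. A minimal sketch with a toy vector (the helper impute_last_with_median and the window n = 7 are hypothetical choices of mine, not part of the project code):

```r
# Impute the last value with the median of the preceding n observations.
# Toy vector standing in for df_train$meantemp; n = 7 is an arbitrary window.
impute_last_with_median <- function(x, n = 7) {
  stopifnot(length(x) > n)
  prev <- x[(length(x) - n):(length(x) - 1)]  # the n values before the last
  x[length(x)] <- median(prev)
  x
}

temps <- c(16.8, 17.2, 15.2, 14.1, 15.1, 15.0, 14.8, 10.0)  # last value is the outlier
impute_last_with_median(temps)
```

For a single trailing point in a slowly varying daily series, last-value carry-forward and a short-window median give nearly identical results, so the choice matters little here.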
head(df_train)
## # A tibble: 6 × 5
## date meantemp humidity wind_speed meanpressure
## <date> <dbl> <dbl> <dbl> <dbl>
## 1 2013-01-01 10 84.5 0 1016.
## 2 2013-01-02 7.4 92 2.98 1018.
## 3 2013-01-03 7.17 87 4.63 1019.
## 4 2013-01-04 8.67 71.3 1.23 1017.
## 5 2013-01-05 6 86.8 3.7 1016.
## 6 2013-01-06 7 82.8 1.48 1018
tail(df_train)
## # A tibble: 6 × 5
## date meantemp humidity wind_speed meanpressure
## <date> <dbl> <dbl> <dbl> <dbl>
## 1 2016-12-27 16.8 67.6 8.34 1017.
## 2 2016-12-28 17.2 68.0 3.55 1016.
## 3 2016-12-29 15.2 87.9 6 1017.
## 4 2016-12-30 14.1 89.7 6.27 1018.
## 5 2016-12-31 15.1 87 7.32 1016.
## 6 2017-01-01 15.1 100 0 1016
Let us see how the plot looks after removing the outlier:
p <- df_train |>
  ggplot(aes(x=date, y=meantemp)) +
  geom_area(fill="#69b3a2", alpha=0.5) +
  geom_line(color="#69b3a2") +
  ylab("mean temperature (°C)") +
  theme_ipsum()
# Turn it interactive with ggplotly
p <- ggplotly(p)
#p
p
Check whether there are any missing dates:
date_range <- seq(min(df_train$date), max(df_train$date), by = 1)
date_range[!date_range %in% df_train$date]
## Date of length 0
summary(df_test)
## date meantemp humidity wind_speed
## Min. :2017-01-01 Min. :11.00 Min. :17.75 Min. : 1.387
## 1st Qu.:2017-01-29 1st Qu.:16.44 1st Qu.:39.62 1st Qu.: 5.564
## Median :2017-02-26 Median :19.88 Median :57.75 Median : 8.069
## Mean :2017-02-26 Mean :21.71 Mean :56.26 Mean : 8.144
## 3rd Qu.:2017-03-26 3rd Qu.:27.71 3rd Qu.:71.90 3rd Qu.:10.069
## Max. :2017-04-24 Max. :34.50 Max. :95.83 Max. :19.314
## meanpressure
## Min. : 59
## 1st Qu.:1007
## Median :1013
## Mean :1004
## 3rd Qu.:1017
## Max. :1023
df_test |> describe()
## df_test
##
## 5 Variables 114 Observations
## --------------------------------------------------------------------------------
## date
## n missing distinct Info Mean Gmd .05
## 114 0 114 1 2017-02-26 38.33 2017-01-06
## .10 .25 .50 .75 .90 .95
## 2017-01-12 2017-01-29 2017-02-26 2017-03-26 2017-04-12 2017-04-18
##
## lowest : 2017-01-01 2017-01-02 2017-01-03 2017-01-04 2017-01-05
## highest: 2017-04-20 2017-04-21 2017-04-22 2017-04-23 2017-04-24
## --------------------------------------------------------------------------------
## meantemp
## n missing distinct Info Mean Gmd .05 .10
## 114 0 105 1 21.71 7.226 13.22 14.75
## .25 .50 .75 .90 .95
## 16.44 19.88 27.71 31.00 32.67
##
## lowest : 11.00000 11.72222 11.78947 12.11111 13.04167
## highest: 32.90000 33.50000 34.00000 34.25000 34.50000
## --------------------------------------------------------------------------------
## humidity
## n missing distinct Info Mean Gmd .05 .10
## 114 0 109 1 56.26 21.97 26.74 29.49
## .25 .50 .75 .90 .95
## 39.62 57.75 71.90 78.42 82.20
##
## lowest : 17.75000 19.42857 21.12500 24.12500 26.00000
## highest: 83.52632 84.44444 85.86957 91.64286 95.83333
## --------------------------------------------------------------------------------
## wind_speed
## n missing distinct Info Mean Gmd .05 .10
## 114 0 109 1 8.144 4.031 2.842 3.715
## .25 .50 .75 .90 .95
## 5.564 8.069 10.069 13.464 14.353
##
## lowest : 1.387500 1.625000 1.854545 1.950000 2.100000
## highest: 14.384615 15.512500 16.662500 17.590000 19.314286
## --------------------------------------------------------------------------------
## meanpressure
## n missing distinct Info Mean Gmd .05 .10
## 114 0 109 1 1004 23.16 1002 1004
## .25 .50 .75 .90 .95
## 1007 1013 1017 1019 1021
##
## lowest : 59.000 998.625 999.875 1000.875 1001.600
## highest: 1021.375 1021.556 1021.789 1021.958 1022.810
##
## Value 60 1000 1010 1020
## Frequency 1 14 53 46
## Proportion 0.009 0.123 0.465 0.404
##
## For the frequency table, variable is rounded to the nearest 10
## --------------------------------------------------------------------------------
xts_test_meantemp <- xts(df_test$meantemp, order.by=df_test$date, "%Y-%m-%d")
head(xts_test_meantemp)
## [,1]
## 2017-01-01 15.91304
## 2017-01-02 18.50000
## 2017-01-03 17.11111
## 2017-01-04 18.70000
## 2017-01-05 18.38889
## 2017-01-06 19.31818
tail(xts_test_meantemp)
## [,1]
## 2017-04-19 33.500
## 2017-04-20 34.500
## 2017-04-21 34.250
## 2017-04-22 32.900
## 2017-04-23 32.875
## 2017-04-24 32.000
ts_plot(xts_test_meantemp)
ts_test_meantemp <- ts_ts(xts_test_meantemp)
xts_week_test_meantemp <- apply.weekly(xts_test_meantemp,sum)
ts_week_test_meantemp <- na.remove(ts_ts(xts_week_test_meantemp))
#ts_week_test_meantemp <- as.ts(xts_week_test_meantemp)
length(ts_week_test_meantemp)
## [1] 18
ts_plot(xts_week_test_meantemp)
Now we use the meantemp column to create our time series data. I assign the time series to xts objects, but since many functions used later require a ts object, each time I define an xts object I also convert it to a ts object using tsbox::ts_ts.
min(df_train$date)
## [1] "2013-01-01"
max(df_train$date)
## [1] "2017-01-01"
#ts_train <- zoo(df_train$meantemp, df_train$date)
xts_train_meantemp <- xts(df_train$meantemp, order.by=df_train$date, "%Y-%m-%d")
class(xts_train_meantemp)
## [1] "xts" "zoo"
head(xts_train_meantemp)
## [,1]
## 2013-01-01 10.000000
## 2013-01-02 7.400000
## 2013-01-03 7.166667
## 2013-01-04 8.666667
## 2013-01-05 6.000000
## 2013-01-06 7.000000
tail(xts_train_meantemp)
## [,1]
## 2016-12-27 16.85000
## 2016-12-28 17.21739
## 2016-12-29 15.23810
## 2016-12-30 14.09524
## 2016-12-31 15.05263
## 2017-01-01 15.05263
# convert xts to ts
## Create a daily Date object for ts
#inds <- seq(as.Date("2013-01-01"), as.Date("2017-01-01"), by = "day")
#set.seed(25)
#ts_train <- ts(df_train$meantemp, # random data
# start = c(2013, as.numeric(format(inds[1], "%j"))),
# frequency = 365)
#ts_train <- ts(df_train$meantemp, start = decimal_date(ymd("2013-01-01")), frequency = 365.25 / 7)
Convert the xts objects to ts objects:
ts_train_meantemp <- ts_ts(xts_train_meantemp)
head(ts_train_meantemp)
## Time Series:
## Start = 2013
## End = 2013.01368953503
## Frequency = 365.2425
## [1] 10.000000 7.400000 7.166667 8.666667 6.000000 7.000000
tail(ts_train_meantemp)
## Time Series:
## Start = 2016.98639260218
## End = 2017.00008213721
## Frequency = 365.2425
## [1] 16.85000 17.21739 15.23810 14.09524 15.05263 15.05263
I also show a static plot:
ts_plot(xts_train_meantemp)
From the initial plot, I judge that there is seasonality. To check for more granular seasonal periods, I use seasonal plots. Before that, I aggregate the data weekly, monthly, and quarterly.
# Weekly mean temperature
xts_week_train_meantemp <- apply.weekly(xts_train_meantemp,sum)
ts_week_train_meantemp <-ts_ts(xts_week_train_meantemp)
# Monthly mean temperature
xts_mon_train_meantemp <- aggregate(xts_train_meantemp, by=as.yearmon, FUN=sum)
ts_mon_train_meantemp <-ts_ts(xts_mon_train_meantemp)
# Quarterly mean temperature
xts_quar_train_meantemp <- aggregate(xts_train_meantemp, as.yearqtr, FUN=sum)
ts_quar_train_meantemp <-ts_ts(xts_quar_train_meantemp)
# Yearly mean temperate
as.year <- function(x) as.integer(as.yearmon(x))
xts_year_train_meantemp <- aggregate(xts_train_meantemp, by=as.year, FUN=sum)
#ts_year_train_meantemp <-ts_ts(xts_year_train_meantemp)
#xts_year_train_meantemp[1]
The year 2017 has only one observation, so I remove it from all the aggregated datasets. I could not do this before aggregating; otherwise I would have run into the error Error: series has no regular pattern.
xts_week_train_meantemp <- head(xts_week_train_meantemp, -1)
xts_mon_train_meantemp <- head(xts_mon_train_meantemp, -1)
xts_quar_train_meantemp <- head(xts_quar_train_meantemp, -1)
ts_week_train_meantemp <- head(ts_week_train_meantemp, -1)
ts_mon_train_meantemp <- head(ts_mon_train_meantemp, -1)
ts_quar_train_meantemp <- head(ts_quar_train_meantemp, -1)
#options(repr.plot.width = 7, repr.plot.height =20)
forecast::ggseasonplot(ts_mon_train_meantemp, year.labels=TRUE, year.labels.left=TRUE, labelgap = 0.1) +
  ylab("degree") +
  ggtitle("Seasonal plot: Monthly Mean Temperature")
forecast::ggseasonplot(ts_mon_train_meantemp, year.labels=TRUE, year.labels.left=TRUE, labelgap = 0.1, polar=TRUE) +
  ylab("degree") +
  ggtitle("Polar Seasonal plot: Monthly Mean Temperature")
#options(repr.plot.width = 7, repr.plot.height =20)
forecast::ggseasonplot(ts_quar_train_meantemp, year.labels=TRUE, year.labels.left=TRUE, labelgap = 0.1) +
  ylab("degree") +
  ggtitle("Seasonal plot: Quarterly Mean Temperature")
forecast::ggseasonplot(ts_quar_train_meantemp, year.labels=TRUE, year.labels.left=TRUE, labelgap = 0.1, polar=TRUE) +
  ylab("degree") +
  ggtitle("Polar Seasonal plot: Quarterly Mean Temperature")
If I need to handle several seasonal periods together, I need the forecast::msts function. For instance, below I model weekly and yearly seasonality together.
des_ts_train_meantemp <- msts(xts_train_meantemp,seasonal.periods = c(7,365))
#head(des_xts_train)
#library(tsbox)
#ts_train <-ts_ts(xts_train)
#ts_train
class(des_ts_train_meantemp)
## [1] "msts" "ts"
However, its output had an unfamiliar shape to me, and I was not sure it uses the state-of-the-art X-13 decomposition, so I switched to X-13ARIMA-SEATS via the seasonal::seas function. That function has some limitations, as stated in the package's reference manual: the number of observations must not exceed 780, and the seasonal period must not exceed 12. That is why I could not use the original data ts_train_meantemp or the weekly aggregated data ts_week_train_meantemp, as I would run into the error Seasonal period too large. The aggregation with the highest possible frequency was the monthly one, ts_mon_train_meantemp. However, I am concerned that I lose significant patterns and information with this amount of aggregation.
Q3. If you could kindly share your viewpoint here, it would be very helpful for me in deciding how to proceed.
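One possible way around the 780-observation and period-12 limits of seasonal::seas is base R's stats::stl, which is Loess-based rather than X-13 but decomposes a long daily series without aggregating. A sketch on synthetic daily data (the series x here is only a stand-in for ts_train_meantemp, with an assumed 365-day sinusoidal seasonal pattern):

```r
# Alternative to seasonal::seas for long daily series: Loess-based STL.
# stats::stl has no 780-observation or period-12 limit, but note it is
# not the X-13ARIMA-SEATS algorithm.
set.seed(42)
n <- 4 * 365  # four years of synthetic daily data
x <- ts(25 + 10 * sin(2 * pi * seq_len(n) / 365) + rnorm(n), frequency = 365)

fit   <- stl(x, s.window = "periodic")          # seasonal / trend / remainder
x_adj <- x - fit$time.series[, "seasonal"]      # seasonally adjusted series

# Removing the seasonal component should shrink the variance sharply
var(x_adj) < var(x)
```

This keeps all 1462 daily observations in play; the trade-off is that STL's seasonal component is estimated by local regression rather than by the SEATS model-based filters.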
length(xts_train_meantemp)
## [1] 1462
length(ts_train_meantemp)
## [1] 1462
length(xts_train_meantemp)
## [1] 1462
nowXTS <- ts_xts(ts_train_meantemp)
length(nowXTS)
## [1] 1462
length(ts_week_train_meantemp)
## [1] 208
plot(ts_week_train_meantemp)
length(ts_week_train_meantemp)
## [1] 208
plot(ts_train_meantemp)
length(ts_train_meantemp)
## [1] 1462
plot(ts_mon_train_meantemp)
length(ts_mon_train_meantemp)
## [1] 48
m <- seas(ts_mon_train_meantemp)
ts_train_adj_meantemp <- final(m)
#ts_train_adj
length(ts_train_adj_meantemp)
## [1] 48
plot(ts_train_adj_meantemp)
Plot the original data along with the trend and the seasonally adjusted data:
#ts_train
#series(m, "forecast.forecasts")
#out(m)
#seasadj(m)
autoplot(ts_mon_train_meantemp, series="Original Data") +
autolayer(trendcycle(m), series="Trend") +
autolayer(seasadj(m), series="Seasonally Adjusted") +
xlab("Year") + ylab("Mean Temperature") +
ggtitle("Mean Temperature Decomposed using X13") +
scale_colour_manual(values=c("gray","blue","red"),
breaks=c("Original Data","Seasonally Adjusted","Trend"))
#ap <- ggplotly(ap)
In the seasonally adjusted time series ts_train_adj_meantemp, I detected a trend, so I detrend it using differencing.
#ts_train_adj_meantemp |> log() |> nsdiffs(alpha=0.01) -> ts_train_adj_det_meantemp
ts_train_adj_meantemp |> log() |> diff() -> ts_train_adj_det_meantemp
plot(ts_train_adj_det_meantemp)
#plot(d)
ggAcf(ts_week_train_meantemp, lag=50)
pacf(ts_week_train_meantemp, lag=50, pl = TRUE)
ggAcf(ts_train_adj_meantemp, lag=10)
pacf(ts_train_adj_meantemp, lag=10, pl = TRUE)
ggAcf(ts_train_adj_det_meantemp, lag=10)
pacf(ts_train_adj_det_meantemp, lag=10, pl = TRUE)
ts_train_meantemp |> adf.test()
##
## Augmented Dickey-Fuller Test
##
## data: ts_train_meantemp
## Dickey-Fuller = -1.9871, Lag order = 11, p-value = 0.5838
## alternative hypothesis: stationary
ts_week_train_meantemp |> adf.test()
##
## Augmented Dickey-Fuller Test
##
## data: ts_week_train_meantemp
## Dickey-Fuller = -3.8729, Lag order = 5, p-value = 0.01651
## alternative hypothesis: stationary
ts_train_adj_meantemp |> adf.test()
##
## Augmented Dickey-Fuller Test
##
## data: ts_train_adj_meantemp
## Dickey-Fuller = -2.5897, Lag order = 3, p-value = 0.3386
## alternative hypothesis: stationary
ts_train_adj_det_meantemp |> adf.test()
## Warning in adf.test(ts_train_adj_det_meantemp): p-value smaller than printed p-
## value
##
## Augmented Dickey-Fuller Test
##
## data: ts_train_adj_det_meantemp
## Dickey-Fuller = -4.698, Lag order = 3, p-value = 0.01
## alternative hypothesis: stationary
ts_train_meantemp |> ur.kpss() |> summary()
##
## #######################
## # KPSS Unit Root Test #
## #######################
##
## Test is of type: mu with 7 lags.
##
## Value of test-statistic is: 0.5812
##
## Critical value for a significance level of:
## 10pct 5pct 2.5pct 1pct
## critical values 0.347 0.463 0.574 0.739
ts_week_train_meantemp |> ur.kpss() |> summary()
##
## #######################
## # KPSS Unit Root Test #
## #######################
##
## Test is of type: mu with 4 lags.
##
## Value of test-statistic is: 0.1618
##
## Critical value for a significance level of:
## 10pct 5pct 2.5pct 1pct
## critical values 0.347 0.463 0.574 0.739
ts_train_adj_meantemp |> ur.kpss() |> summary()
##
## #######################
## # KPSS Unit Root Test #
## #######################
##
## Test is of type: mu with 3 lags.
##
## Value of test-statistic is: 0.9974
##
## Critical value for a significance level of:
## 10pct 5pct 2.5pct 1pct
## critical values 0.347 0.463 0.574 0.739
ts_train_adj_det_meantemp |> ur.kpss() |> summary()
##
## #######################
## # KPSS Unit Root Test #
## #######################
##
## Test is of type: mu with 3 lags.
##
## Value of test-statistic is: 0.0595
##
## Critical value for a significance level of:
## 10pct 5pct 2.5pct 1pct
## critical values 0.347 0.463 0.574 0.739
ts_train_meantemp |> ur.df() |> summary()
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression none
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 - 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -9.6532 -0.8403 0.1233 1.0729 6.5542
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## z.lag.1 -0.001560 0.001622 -0.962 0.336
## z.diff.lag -0.158894 0.025836 -6.150 9.98e-10 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.644 on 1458 degrees of freedom
## Multiple R-squared: 0.02616, Adjusted R-squared: 0.02483
## F-statistic: 19.59 on 2 and 1458 DF, p-value: 4.036e-09
##
##
## Value of test-statistic is: -0.9621
##
## Critical values for test statistics:
## 1pct 5pct 10pct
## tau1 -2.58 -1.95 -1.62
ts_week_train_meantemp |> ur.df() |> summary()
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression none
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 - 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -35.318 -10.542 1.810 9.478 33.612
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## z.lag.1 -0.002234 0.005344 -0.418 0.676
## z.diff.lag -0.050285 0.068684 -0.732 0.465
##
## Residual standard error: 14.27 on 204 degrees of freedom
## Multiple R-squared: 0.003633, Adjusted R-squared: -0.006135
## F-statistic: 0.3719 on 2 and 204 DF, p-value: 0.6899
##
##
## Value of test-statistic is: -0.418
##
## Critical values for test statistics:
## 1pct 5pct 10pct
## tau1 -2.58 -1.95 -1.62
ts_train_adj_meantemp |> ur.df() |> summary()
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression none
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 - 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -51.384 -13.881 -1.239 16.337 55.819
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## z.lag.1 0.00375 0.00445 0.843 0.4040
## z.diff.lag -0.46189 0.13223 -3.493 0.0011 **
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 23.31 on 44 degrees of freedom
## Multiple R-squared: 0.2201, Adjusted R-squared: 0.1846
## F-statistic: 6.208 on 2 and 44 DF, p-value: 0.004217
##
##
## Value of test-statistic is: 0.8427
##
## Critical values for test statistics:
## 1pct 5pct 10pct
## tau1 -2.62 -1.95 -1.61
ts_train_adj_det_meantemp |> ur.df() |> summary()
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression none
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 - 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -0.062429 -0.012003 0.005823 0.020854 0.073373
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## z.lag.1 -1.8387 0.2483 -7.407 3.33e-09 ***
## z.diff.lag 0.2468 0.1441 1.713 0.0939 .
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 0.0295 on 43 degrees of freedom
## Multiple R-squared: 0.7547, Adjusted R-squared: 0.7433
## F-statistic: 66.16 on 2 and 43 DF, p-value: 7.539e-14
##
##
## Value of test-statistic is: -7.4065
##
## Critical values for test statistics:
## 1pct 5pct 10pct
## tau1 -2.62 -1.95 -1.61
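The ADF and KPSS results above can be cross-checked with forecast::ndiffs, which applies a unit-root test (KPSS by default) repeatedly to estimate how many differences a series needs. A sketch on a synthetic random walk rather than the project data (rw is a stand-in series; the forecast package is already among those used here):

```r
# Cross-check of the unit-root tests: forecast::ndiffs estimates the
# number of differences needed for stationarity (KPSS-based by default).
library(forecast)

set.seed(7)
rw <- ts(cumsum(rnorm(500)))  # random walk: the textbook non-stationary series

ndiffs(rw)        # number of differences for the level series (typically 1)
ndiffs(diff(rw))  # for the differenced series (typically 0)
```

With the project data, the same call on ts_train_adj_meantemp would corroborate (or contradict) the single difference applied in the Detrend step.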
In cases 3 and 4 I do not set seasonal=TRUE, as there I use data that I have already seasonally adjusted. Setting seasonal=TRUE also makes fitting the model more time-consuming.
#forecast_ts_train_meantemp = auto.arima(ts_train_meantemp,
# trace = TRUE,
# seasonal=TRUE,
# stepwise=FALSE,
# approximation=FALSE)
#checkresiduals(forecast_ts_train_meantemp)
forecast_ts_train_meantemp <- auto.arima(ts_train_meantemp,
d = 1,
D = 1,
start.p = 2,
start.q = 3,
max.p = 2,
#max.d = 1,
max.q = 3,
start.P = 0,
start.Q = 0,
max.P = 0,
#max.D = 1,
max.Q = 0,
trace = TRUE,
seasonal=TRUE,
stepwise=TRUE,
approximation=FALSE
#nmodels=
)
##
## ARIMA(2,1,3)(0,1,0)[365] : 4789.495
## ARIMA(0,1,0)(0,1,0)[365] : 4943.724
## ARIMA(1,1,0)(0,1,0)[365] : 4926.887
## ARIMA(0,1,1)(0,1,0)[365] : 4920.711
## ARIMA(1,1,3)(0,1,0)[365] : 4789.51
## ARIMA(2,1,2)(0,1,0)[365] : Inf
## ARIMA(1,1,2)(0,1,0)[365] : 4790.218
##
## Best model: ARIMA(2,1,3)(0,1,0)[365]
checkresiduals(forecast_ts_train_meantemp)
##
## Ljung-Box test
##
## data: Residuals from ARIMA(2,1,3)(0,1,0)[365]
## Q* = 363.76, df = 287, p-value = 0.001427
##
## Model df: 5. Total lags used: 292
forecast_ts_train_meantemp
## Series: ts_train_meantemp
## ARIMA(2,1,3)(0,1,0)[365]
##
## Coefficients:
## ar1 ar2 ma1 ma2 ma3
## 0.0104 0.4362 -0.2760 -0.6134 -0.0888
## s.e. 0.3402 0.2577 0.3395 0.3585 0.0480
##
## sigma^2 = 4.561: log likelihood = -2388.71
## AIC=4789.42 AICc=4789.5 BIC=4819.41
As in case 3 I use data that I have already seasonally adjusted, I do not set seasonal=TRUE; setting it makes fitting the model more time-consuming.
#forecast_ts_week_train_meantemp = auto.arima(ts_week_train_meantemp,
# trace = TRUE,
# seasonal=TRUE,
# stepwise=FALSE,
# approximation=FALSE)
#checkresiduals(forecast_ts_week_train_meantemp)
#forecast_ts_week_train_meantemp
forecast_ts_train_adj_meantemp = auto.arima(ts_train_adj_meantemp,
trace = TRUE,
seasonal= FALSE,
stepwise=FALSE,
approximation=FALSE)
##
## ARIMA(0,1,0) : 441.3883
## ARIMA(0,1,0) with drift : 443.1154
## ARIMA(0,1,1) : 429.9577
## ARIMA(0,1,1) with drift : 429.5371
## ARIMA(0,1,2) : 432.2379
## ARIMA(0,1,2) with drift : 431.8095
## ARIMA(0,1,3) : 434.4112
## ARIMA(0,1,3) with drift : 433.7996
## ARIMA(0,1,4) : 434.6353
## ARIMA(0,1,4) with drift : Inf
## ARIMA(0,1,5) : Inf
## ARIMA(0,1,5) with drift : Inf
## ARIMA(1,1,0) : 432.8756
## ARIMA(1,1,0) with drift : 434.1589
## ARIMA(1,1,1) : 432.2366
## ARIMA(1,1,1) with drift : 431.7635
## ARIMA(1,1,2) : 434.6252
## ARIMA(1,1,2) with drift : Inf
## ARIMA(1,1,3) : 436.6632
## ARIMA(1,1,3) with drift : Inf
## ARIMA(1,1,4) : 436.8366
## ARIMA(1,1,4) with drift : Inf
## ARIMA(2,1,0) : 432.5313
## ARIMA(2,1,0) with drift : 433.449
## ARIMA(2,1,1) : 434.5063
## ARIMA(2,1,1) with drift : 433.7124
## ARIMA(2,1,2) : 436.9823
## ARIMA(2,1,2) with drift : Inf
## ARIMA(2,1,3) : Inf
## ARIMA(2,1,3) with drift : Inf
## ARIMA(3,1,0) : 434.8565
## ARIMA(3,1,0) with drift : 435.9464
## ARIMA(3,1,1) : 436.788
## ARIMA(3,1,1) with drift : Inf
## ARIMA(3,1,2) : Inf
## ARIMA(3,1,2) with drift : Inf
## ARIMA(4,1,0) : 436.5007
## ARIMA(4,1,0) with drift : 437.448
## ARIMA(4,1,1) : Inf
## ARIMA(4,1,1) with drift : 436.8826
## ARIMA(5,1,0) : 436.434
## ARIMA(5,1,0) with drift : 436.606
##
##
##
## Best model: ARIMA(0,1,1) with drift
checkresiduals(forecast_ts_train_adj_meantemp)
##
## Ljung-Box test
##
## data: Residuals from ARIMA(0,1,1) with drift
## Q* = 7.4191, df = 9, p-value = 0.5936
##
## Model df: 1. Total lags used: 10
forecast_ts_train_adj_meantemp
## Series: ts_train_adj_meantemp
## ARIMA(0,1,1) with drift
##
## Coefficients:
## ma1 drift
## -0.6827 1.9880
## s.e. 0.1279 1.0526
##
## sigma^2 = 488.7: log likelihood = -211.49
## AIC=428.98 AICc=429.54 BIC=434.53
forecast_ts_train_adj_det_meantemp = auto.arima(ts_train_adj_det_meantemp,
trace = TRUE,
seasonal= FALSE,
stepwise=FALSE,
approximation=FALSE)
##
## ARIMA(0,0,0) with zero mean : -183.1965
## ARIMA(0,0,0) with non-zero mean : -181.4467
## ARIMA(0,0,1) with zero mean : -195.2457
## ARIMA(0,0,1) with non-zero mean : -195.724
## ARIMA(0,0,2) with zero mean : -192.9693
## ARIMA(0,0,2) with non-zero mean : -193.439
## ARIMA(0,0,3) with zero mean : -190.7534
## ARIMA(0,0,3) with non-zero mean : -191.4021
## ARIMA(0,0,4) with zero mean : -190.4085
## ARIMA(0,0,4) with non-zero mean : Inf
## ARIMA(0,0,5) with zero mean : Inf
## ARIMA(0,0,5) with non-zero mean : Inf
## ARIMA(1,0,0) with zero mean : -191.9675
## ARIMA(1,0,0) with non-zero mean : -190.6383
## ARIMA(1,0,1) with zero mean : -192.971
## ARIMA(1,0,1) with non-zero mean : -193.4768
## ARIMA(1,0,2) with zero mean : -190.5819
## ARIMA(1,0,2) with non-zero mean : Inf
## ARIMA(1,0,3) with zero mean : -188.4672
## ARIMA(1,0,3) with non-zero mean : Inf
## ARIMA(1,0,4) with zero mean : -188.2137
## ARIMA(1,0,4) with non-zero mean : Inf
## ARIMA(2,0,0) with zero mean : -192.6467
## ARIMA(2,0,0) with non-zero mean : -191.6913
## ARIMA(2,0,1) with zero mean : -190.668
## ARIMA(2,0,1) with non-zero mean : -191.4938
## ARIMA(2,0,2) with zero mean : -188.1949
## ARIMA(2,0,2) with non-zero mean : Inf
## ARIMA(2,0,3) with zero mean : Inf
## ARIMA(2,0,3) with non-zero mean : Inf
## ARIMA(3,0,0) with zero mean : -190.3099
## ARIMA(3,0,0) with non-zero mean : -189.1897
## ARIMA(3,0,1) with zero mean : -188.4346
## ARIMA(3,0,1) with non-zero mean : Inf
## ARIMA(3,0,2) with zero mean : Inf
## ARIMA(3,0,2) with non-zero mean : Inf
## ARIMA(4,0,0) with zero mean : -188.6455
## ARIMA(4,0,0) with non-zero mean : -187.6592
## ARIMA(4,0,1) with zero mean : Inf
## ARIMA(4,0,1) with non-zero mean : -188.2978
## ARIMA(5,0,0) with zero mean : -188.6625
## ARIMA(5,0,0) with non-zero mean : -188.4274
##
##
##
## Best model: ARIMA(0,0,1) with non-zero mean
checkresiduals(forecast_ts_train_adj_det_meantemp)
##
## Ljung-Box test
##
## data: Residuals from ARIMA(0,0,1) with non-zero mean
## Q* = 7.1212, df = 8, p-value = 0.5236
##
## Model df: 1. Total lags used: 9
#checkresiduals(forecast_ts_train_meantemp)
forecast_ts_train_adj_det_meantemp
## Series: ts_train_adj_det_meantemp
## ARIMA(0,0,1) with non-zero mean
##
## Coefficients:
## ma1 mean
## -0.6984 0.0025
## s.e. 0.1247 0.0013
##
## sigma^2 = 0.0008149: log likelihood = 101.14
## AIC=-196.28 AICc=-195.72 BIC=-190.73
Based on the fitted model, we have AIC=4789.42, AICc=4789.5, and BIC=4819.41.
Now I manually compute the following evaluation metrics between prediction and test data: RMSE, MAE, \(R^2\) score.
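These three metrics need only a few lines of base R. A sketch with toy vectors (with the real data, actual would be df_test$meantemp and pred the point forecasts of the fitted model):

```r
# Manual evaluation metrics between predictions and test values.
rmse <- function(actual, pred) sqrt(mean((actual - pred)^2))
mae  <- function(actual, pred) mean(abs(actual - pred))
r2   <- function(actual, pred)
  1 - sum((actual - pred)^2) / sum((actual - mean(actual))^2)

# Toy vectors; in this project, actual would be df_test$meantemp and
# pred the point forecasts from the ARIMA model.
actual <- c(15.9, 18.5, 17.1, 18.7, 18.4)
pred   <- c(15.8, 17.4, 17.6, 19.3, 19.1)

c(RMSE = rmse(actual, pred), MAE = mae(actual, pred), R2 = r2(actual, pred))
```

Note that this R² definition (1 minus the ratio of residual to total sum of squares) can go negative when the forecast is worse than predicting the test mean, which is itself a useful diagnostic.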
forecast <- forecast_ts_train_meantemp |> forecast(h=114)
forecast
## Point Forecast Lo 80 Hi 80 Lo 95 Hi 95
## 2017.0028 15.78625 13.049155 18.52334 11.600225 19.97227
## 2017.0056 17.39240 13.996459 20.78833 12.198758 22.58603
## 2017.0083 17.62886 13.909235 21.34849 11.940183 23.31754
## 2017.0110 19.27296 15.433348 23.11256 13.400782 25.14513
## 2017.0138 19.09082 15.182132 22.99951 13.112996 25.06865
## 2017.0165 17.51121 13.572824 21.44960 11.487969 23.53445
## 2017.0192 17.89843 13.941383 21.85548 11.846649 23.95021
## 2017.0220 17.68640 13.719861 21.65294 11.620102 23.75270
## 2017.0247 17.82473 13.851529 21.79793 11.748244 23.90121
## 2017.0275 20.08364 16.106289 24.06099 14.000806 26.16648
## 2017.0302 20.35622 16.375570 24.33686 14.268344 26.44409
## 2017.0329 17.65600 13.672870 21.63913 11.564329 23.74767
## 2017.0357 15.09612 11.110809 19.08142 9.001115 21.19112
## 2017.0384 15.69787 11.710681 19.68505 9.599992 21.79574
## 2017.0412 16.09903 12.110083 20.08797 9.998463 22.19959
## 2017.0439 15.36647 11.375888 19.35705 9.263402 21.46954
## 2017.0466 14.45746 10.465292 18.44963 8.351967 20.56295
## 2017.0494 14.16733 10.173625 18.16103 8.059487 20.27517
## 2017.0521 14.28839 10.293174 18.28360 8.178236 20.39854
## 2017.0548 13.83437 9.837670 17.83108 7.721943 19.94680
## 2017.0576 16.53864 12.540459 20.53682 10.423949 22.65333
## 2017.0603 13.28871 9.289057 17.28836 7.171769 19.40565
## 2017.0631 13.76792 9.766804 17.76904 7.648741 19.88710
## 2017.0658 16.66378 12.661208 20.66636 10.542372 22.78520
## 2017.0685 19.68464 15.680604 23.68867 13.560996 25.80828
## 2017.0713 18.95846 14.952971 22.96395 12.832594 25.08432
## 2017.0740 21.66383 17.656884 25.67077 15.535737 27.79191
## 2017.0767 22.24419 18.235795 26.25258 16.113879 28.37450
## 2017.0795 19.47634 15.466490 23.48618 13.343807 25.60886
## 2017.0822 17.94749 13.936196 21.95879 11.812745 24.08224
## 2017.0850 17.36801 13.355261 21.38075 11.231043 23.50497
## 2017.0877 15.22634 11.212147 19.24053 9.087161 21.36552
## 2017.0904 18.46498 14.449336 22.48062 12.323583 24.60637
## 2017.0932 14.76801 10.750919 18.78510 8.624400 20.91162
## 2017.0959 17.43468 13.416139 21.45321 11.288854 23.58050
## 2017.0986 17.72634 13.706359 21.74633 11.578308 23.87438
## 2017.1014 19.19225 15.170822 23.21368 13.042006 25.34250
## 2017.1041 19.87057 15.847698 23.89345 13.718116 26.02303
## 2017.1069 20.23468 16.210356 24.25900 14.080009 26.38934
## 2017.1096 21.78884 17.763078 25.81461 15.631967 27.94572
## 2017.1123 21.30134 17.274134 25.32855 15.142258 27.46043
## 2017.1151 19.16801 15.139357 23.19666 13.006717 25.32930
## 2017.1178 19.74420 15.714105 23.77430 13.580701 25.90770
## 2017.1206 19.70134 15.669805 23.73288 13.535638 25.86705
## 2017.1233 20.31563 16.282649 24.34861 14.147718 26.48354
## 2017.1260 20.81563 16.781207 24.85005 14.645514 26.98574
## 2017.1288 21.56801 17.532147 25.60387 15.395691 27.74033
## 2017.1315 25.25519 21.217886 29.29249 19.080668 31.42971
## 2017.1342 25.72634 21.687600 29.76509 19.549619 31.90307
## 2017.1370 23.10134 19.061161 27.14152 16.922418 29.28027
## 2017.1397 23.10134 19.059722 27.14296 16.920217 29.28247
## 2017.1425 23.52991 19.486855 27.57297 17.346589 29.71324
## 2017.1452 23.78884 19.744346 27.83334 17.603319 29.97437
## 2017.1479 24.66384 20.617909 28.70978 18.476120 30.85157
## 2017.1507 24.90134 20.853972 28.94871 18.711423 31.09126
## 2017.1534 24.90134 20.852536 28.95015 18.709226 31.09346
## 2017.1561 25.10134 21.051100 29.15159 18.907031 31.29566
## 2017.1589 25.97634 21.924664 30.02802 19.779835 32.17285
## 2017.1616 27.01801 22.964896 31.07112 20.819308 33.21671
## 2017.1644 27.03468 22.980129 31.08922 20.833781 33.23557
## 2017.1671 28.10134 24.045362 32.15732 21.898255 34.30443
## 2017.1698 29.41384 25.356428 33.47126 23.208563 35.61912
## 2017.1726 26.03468 21.975829 30.09352 19.827205 32.24215
## 2017.1753 24.91384 20.853563 28.97412 18.704181 31.12350
## 2017.1780 25.81563 21.753917 29.87734 19.603777 32.02748
## 2017.1808 25.52991 21.466772 29.59306 19.315874 31.74395
## 2017.1835 26.10134 22.036770 30.16592 19.885115 32.31757
## 2017.1863 27.66384 23.597839 31.72985 21.445427 33.88226
## 2017.1890 27.16801 23.100576 31.23544 20.947407 33.38861
## 2017.1917 26.66384 22.594981 30.73271 20.441055 32.88663
## 2017.1945 26.35134 22.281052 30.42163 20.126370 32.57632
## 2017.1972 24.47634 20.404624 28.54806 18.249186 30.70350
## 2017.1999 26.16801 22.094863 30.24116 19.938669 32.39735
## 2017.2027 26.03884 21.964269 30.11342 19.807319 32.27037
## 2017.2054 28.41384 24.337842 32.48984 22.180137 34.64755
## 2017.2082 28.28884 24.211416 32.36627 22.052956 34.52473
## 2017.2109 28.88706 24.808204 32.96591 22.648990 35.12512
## 2017.2136 29.23468 25.154398 33.31495 22.994429 35.47492
## 2017.2164 28.72634 24.644640 32.80805 22.483917 34.96877
## 2017.2191 27.16384 23.080716 31.24697 20.919239 33.40845
## 2017.2219 28.30134 24.216792 32.38589 22.054561 34.54812
## 2017.2246 30.23468 26.148702 34.32065 23.985718 36.48363
## 2017.2273 31.97634 27.888946 36.06374 25.725209 38.22748
## 2017.2301 26.76801 22.679191 30.85683 20.514700 33.02132
## 2017.2328 28.35134 24.261102 32.44158 22.095859 34.60683
## 2017.2355 28.03468 23.943014 32.12634 21.777019 34.29233
## 2017.2383 29.22634 25.133260 33.31943 22.966513 35.48617
## 2017.2410 31.67277 27.578268 35.76727 25.410769 37.93477
## 2017.2438 32.10134 28.005420 36.19727 25.837169 38.36552
## 2017.2465 32.67277 28.575429 36.77011 26.406427 38.93912
## 2017.2492 34.41384 30.315082 38.51260 28.145329 40.68236
## 2017.2520 35.41384 31.313664 39.51402 29.143160 41.68453
## 2017.2547 34.91384 30.812246 39.01544 28.640992 41.18669
## 2017.2574 34.41384 30.310829 38.51686 28.138824 40.68886
## 2017.2602 33.47634 29.371912 37.58077 27.199157 39.75353
## 2017.2629 32.03468 27.928829 36.14052 25.755324 38.31403
## 2017.2657 31.36801 27.260746 35.47527 25.086492 37.64953
## 2017.2684 32.83468 28.725997 36.94336 26.550994 39.11836
## 2017.2711 34.35134 30.241249 38.46144 28.065497 40.63719
## 2017.2739 31.90134 27.789835 36.01285 25.613334 38.18935
## 2017.2766 32.30134 28.188421 36.41426 26.011172 38.59151
## 2017.2793 33.85134 29.737008 37.96568 27.559010 40.14368
## 2017.2821 35.22634 31.110595 39.34209 28.931849 41.52084
## 2017.2848 35.72634 31.609182 39.84350 29.429689 42.02300
## 2017.2876 37.78884 33.670270 41.90742 31.490029 44.08766
## 2017.2903 36.76801 32.648025 40.88799 30.467037 43.06898
## 2017.2930 36.72634 32.604948 40.84774 30.423213 43.02947
## 2017.2958 36.10134 31.978537 40.22415 29.796056 42.40663
## 2017.2985 36.16384 32.039627 40.28806 29.856399 42.47129
## 2017.3013 36.10134 31.975718 40.22697 29.791744 42.41094
## 2017.3040 35.35134 31.224309 39.47838 29.039589 41.66310
## 2017.3067 34.01801 29.889567 38.14645 27.704101 40.33192
## 2017.3095 33.41384 29.283992 37.54369 27.097781 39.72991
## 2017.3122 33.85134 29.720084 37.98260 27.533128 40.16956
predicted <- as.numeric(forecast$mean)
actual <- as.numeric(ts_test_meantemp)
rmse(predicted, actual)
## [1] 4.298832
mae(predicted, actual)
## [1] 3.575725
rsq <- function(x, y) cor(x, y)^2
rsq(actual, predicted)
## [1] 0.8174112
autoplot(forecast(forecast_ts_train_meantemp)) # + autolayer(xts_test_meantemp)
length(ts_test_meantemp)
## [1] 114
forecast_ts_train_meantemp |> forecast(h=114) |>
  autoplot() + autolayer(ts_test_meantemp)
#autoplot(forecast(ts_week_train_meantemp)) #+ autolayer(ts_week_test_meantemp)
#forecast_ts_train_adj + ts_train_adj
autoplot(forecast(forecast_ts_train_adj_meantemp))
autoplot(forecast(forecast_ts_train_adj_det_meantemp))
Let us plot the forecast over the test period (from case 1) together with the actual test data.
#ts_plot(ts_test_meantemp, forecast$mean)
#ts.union(ts_test_meantemp, forecast$mean)
#forecast$mean
xts_temp <- xts(ts_test_meantemp, order.by=df_test$date, "%Y-%m-%d")
xts_temp_2 <- xts(forecast$mean, order.by=df_test$date, "%Y-%m-%d")
xts_temp
## [,1]
## 2017-01-01 15.91304
## 2017-01-02 18.50000
## 2017-01-03 17.11111
## 2017-01-04 18.70000
## 2017-01-05 18.38889
## 2017-01-06 19.31818
## 2017-01-07 14.70833
## 2017-01-08 15.68421
## 2017-01-09 14.57143
## 2017-01-10 12.11111
## 2017-01-11 11.00000
## 2017-01-12 11.78947
## 2017-01-13 13.23529
## 2017-01-14 13.20000
## 2017-01-15 16.43478
## 2017-01-16 14.65000
## 2017-01-17 11.72222
## 2017-01-18 13.04167
## 2017-01-19 14.61905
## 2017-01-20 15.26316
## 2017-01-21 15.39130
## 2017-01-22 18.44000
## 2017-01-23 18.11765
## 2017-01-24 18.34783
## 2017-01-25 21.00000
## 2017-01-26 16.17857
## 2017-01-27 16.50000
## 2017-01-28 14.86364
## 2017-01-29 15.66667
## 2017-01-30 16.44444
## 2017-01-31 16.12500
## 2017-02-01 15.25000
## 2017-02-02 17.09091
## 2017-02-03 15.63636
## 2017-02-04 18.70000
## 2017-02-05 18.63158
## 2017-02-06 16.88889
## 2017-02-07 15.12500
## 2017-02-08 15.70000
## 2017-02-09 15.37500
## 2017-02-10 14.66667
## 2017-02-11 15.62500
## 2017-02-12 16.25000
## 2017-02-13 16.33333
## 2017-02-14 16.87500
## 2017-02-15 17.57143
## 2017-02-16 20.25000
## 2017-02-17 21.30000
## 2017-02-18 21.12500
## 2017-02-19 22.36364
## 2017-02-20 23.37500
## 2017-02-21 21.83333
## 2017-02-22 19.12500
## 2017-02-23 18.62500
## 2017-02-24 19.12500
## 2017-02-25 19.00000
## 2017-02-26 18.75000
## 2017-02-27 19.87500
## 2017-02-28 23.33333
## 2017-03-01 24.46154
## 2017-03-02 23.75000
## 2017-03-03 20.50000
## 2017-03-04 19.12500
## 2017-03-05 19.75000
## 2017-03-06 20.00000
## 2017-03-07 22.62500
## 2017-03-08 21.54545
## 2017-03-09 20.78571
## 2017-03-10 19.93750
## 2017-03-11 18.53333
## 2017-03-12 17.37500
## 2017-03-13 17.44444
## 2017-03-14 18.00000
## 2017-03-15 19.87500
## 2017-03-16 24.00000
## 2017-03-17 20.90000
## 2017-03-18 24.69231
## 2017-03-19 24.66667
## 2017-03-20 23.33333
## 2017-03-21 25.00000
## 2017-03-22 27.25000
## 2017-03-23 28.00000
## 2017-03-24 28.91667
## 2017-03-25 26.50000
## 2017-03-26 29.10000
## 2017-03-27 29.50000
## 2017-03-28 29.88889
## 2017-03-29 31.00000
## 2017-03-30 29.28571
## 2017-03-31 30.62500
## 2017-04-01 31.37500
## 2017-04-02 29.75000
## 2017-04-03 30.50000
## 2017-04-04 30.93333
## 2017-04-05 29.23077
## 2017-04-06 31.22222
## 2017-04-07 27.00000
## 2017-04-08 25.62500
## 2017-04-09 27.12500
## 2017-04-10 27.85714
## 2017-04-11 29.25000
## 2017-04-12 29.25000
## 2017-04-13 29.66667
## 2017-04-14 30.50000
## 2017-04-15 31.22222
## 2017-04-16 31.00000
## 2017-04-17 32.55556
## 2017-04-18 34.00000
## 2017-04-19 33.50000
## 2017-04-20 34.50000
## 2017-04-21 34.25000
## 2017-04-22 32.90000
## 2017-04-23 32.87500
## 2017-04-24 32.00000
xts_temp_2
## [,1]
## 2017-01-01 15.78625
## 2017-01-02 17.39240
## 2017-01-03 17.62886
## 2017-01-04 19.27296
## 2017-01-05 19.09082
## 2017-01-06 17.51121
## 2017-01-07 17.89843
## 2017-01-08 17.68640
## 2017-01-09 17.82473
## 2017-01-10 20.08364
## 2017-01-11 20.35622
## 2017-01-12 17.65600
## 2017-01-13 15.09612
## 2017-01-14 15.69787
## 2017-01-15 16.09903
## 2017-01-16 15.36647
## 2017-01-17 14.45746
## 2017-01-18 14.16733
## 2017-01-19 14.28839
## 2017-01-20 13.83437
## 2017-01-21 16.53864
## 2017-01-22 13.28871
## 2017-01-23 13.76792
## 2017-01-24 16.66378
## 2017-01-25 19.68464
## 2017-01-26 18.95846
## 2017-01-27 21.66383
## 2017-01-28 22.24419
## 2017-01-29 19.47634
## 2017-01-30 17.94749
## 2017-01-31 17.36801
## 2017-02-01 15.22634
## 2017-02-02 18.46498
## 2017-02-03 14.76801
## 2017-02-04 17.43468
## 2017-02-05 17.72634
## 2017-02-06 19.19225
## 2017-02-07 19.87057
## 2017-02-08 20.23468
## 2017-02-09 21.78884
## 2017-02-10 21.30134
## 2017-02-11 19.16801
## 2017-02-12 19.74420
## 2017-02-13 19.70134
## 2017-02-14 20.31563
## 2017-02-15 20.81563
## 2017-02-16 21.56801
## 2017-02-17 25.25519
## 2017-02-18 25.72634
## 2017-02-19 23.10134
## 2017-02-20 23.10134
## 2017-02-21 23.52991
## 2017-02-22 23.78884
## 2017-02-23 24.66384
## 2017-02-24 24.90134
## 2017-02-25 24.90134
## 2017-02-26 25.10134
## 2017-02-27 25.97634
## 2017-02-28 27.01801
## 2017-03-01 27.03468
## 2017-03-02 28.10134
## 2017-03-03 29.41384
## 2017-03-04 26.03468
## 2017-03-05 24.91384
## 2017-03-06 25.81563
## 2017-03-07 25.52991
## 2017-03-08 26.10134
## 2017-03-09 27.66384
## 2017-03-10 27.16801
## 2017-03-11 26.66384
## 2017-03-12 26.35134
## 2017-03-13 24.47634
## 2017-03-14 26.16801
## 2017-03-15 26.03884
## 2017-03-16 28.41384
## 2017-03-17 28.28884
## 2017-03-18 28.88706
## 2017-03-19 29.23468
## 2017-03-20 28.72634
## 2017-03-21 27.16384
## 2017-03-22 28.30134
## 2017-03-23 30.23468
## 2017-03-24 31.97634
## 2017-03-25 26.76801
## 2017-03-26 28.35134
## 2017-03-27 28.03468
## 2017-03-28 29.22634
## 2017-03-29 31.67277
## 2017-03-30 32.10134
## 2017-03-31 32.67277
## 2017-04-01 34.41384
## 2017-04-02 35.41384
## 2017-04-03 34.91384
## 2017-04-04 34.41384
## 2017-04-05 33.47634
## 2017-04-06 32.03468
## 2017-04-07 31.36801
## 2017-04-08 32.83468
## 2017-04-09 34.35134
## 2017-04-10 31.90134
## 2017-04-11 32.30134
## 2017-04-12 33.85134
## 2017-04-13 35.22634
## 2017-04-14 35.72634
## 2017-04-15 37.78884
## 2017-04-16 36.76801
## 2017-04-17 36.72634
## 2017-04-18 36.10134
## 2017-04-19 36.16384
## 2017-04-20 36.10134
## 2017-04-21 35.35134
## 2017-04-22 34.01801
## 2017-04-23 33.41384
## 2017-04-24 33.85134
ts_plot(xts_temp, xts_temp_2)
Q4. I applied the unit root tests and ARIMA models to all of the following series: the original time series, the deseasonalized time series, and the deseasonalized-and-detrended time series. Judging by the plots, the adjusted versions performed worse when fed to the model than the weekly aggregated data fed to auto.arima with automatic seasonality modelling. Could you please share your viewpoint on this?
Now we model a multivariate time series using both the meantemp and wind_speed columns.
We plot the wind_speed time series first, using an interactive plot.
p2 <- df_train |>
ggplot( aes(x=date, y=wind_speed)) +
geom_area(fill="#69b3a2", alpha=0.5) +
geom_line(color="#69b3a2") +
ylab("wind speed (kmph)") +
theme_ipsum()
# Turn it interactive with ggplotly
p2 <- ggplotly(p2)
#p
p2
Now we use a static plot.
#xts_train_meantemp <- xts(df_train$meantemp, order.by=df_train$date, "%Y-%m-%d")
#ts_train_meantemp <-ts_ts(xts_train_meantemp)
xts_train_windspeed <- xts(df_train$wind_speed, order.by=df_train$date, "%Y-%m-%d")
ts_train_windspeed <- ts_ts(xts_train_windspeed)
ts_plot(ts_train_windspeed)
We must deal with anomalies first.
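The anomaly handling below relies on forecast::tsclean(), which replaces outliers (as identified by tsoutliers()) and interpolates missing values; a minimal sketch of that behaviour on a toy series:

```r
library(forecast)

# A toy series with one obvious spike at position 4.
x <- ts(c(10, 11, 12, 90, 13, 12, 11, 10, 12, 13, 11, 12))
tsoutliers(x)  # index of the flagged point and its suggested replacement
tsclean(x)     # the same series with the spike replaced
```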
xts_train_windspeed <- tsclean(xts_train_windspeed)
ts_plot(xts_train_windspeed)
Let us now do the same with the test data of the wind_speed column, as we need it later for evaluation:
xts_test_windspeed <- xts(df_test$wind_speed, order.by=df_test$date, "%Y-%m-%d")
head(xts_test_windspeed)
## [,1]
## 2017-01-01 2.743478
## 2017-01-02 2.894444
## 2017-01-03 4.016667
## 2017-01-04 4.545000
## 2017-01-05 3.300000
## 2017-01-06 8.681818
tail(xts_test_windspeed)
## [,1]
## 2017-04-19 9.02500
## 2017-04-20 5.56250
## 2017-04-21 6.96250
## 2017-04-22 8.89000
## 2017-04-23 9.96250
## 2017-04-24 12.15714
ts_plot(xts_test_windspeed)
We remove anomalies in the test data too:
xts_test_windspeed <- tsclean(xts_test_windspeed)
ts_test_windspeed <- ts_ts(xts_test_windspeed)
ts_plot(xts_test_windspeed)
In what follows, an interactive plot of both time series is shown:
fig <- plot_ly(df_train, type = 'scatter', mode = 'lines')%>%
add_trace(x = ~date, y = ~meantemp, name = 'MeanTemp')%>%
add_trace(x = ~date, y = ~wind_speed, name = 'WindSpeed')%>%
layout(title = 'Mean temperature and wind speed', legend=list(title=list(text='variable')),
xaxis = list(dtick = "M1", tickformat= "%b\n%Y"), width = 2000)## Warning: Specifying width/height in layout() is now deprecated.
## Please specify in ggplotly() or plot_ly()
options(warn = -1)
fig <- fig %>%
layout(
xaxis = list(zerolinecolor = '#ffff',
zerolinewidth = 2,
gridcolor = 'ffff', tickangle = 0),
yaxis = list(zerolinecolor = '#ffff',
zerolinewidth = 2,
gridcolor = 'ffff'),
plot_bgcolor='#e5ecf6')
fig
We aggregate the wind_speed time series to weekly frequency:
# Weekly wind speed
xts_week_train_windspeed <- apply.weekly(xts_train_windspeed, sum)
ts_week_train_windspeed <- ts_ts(xts_week_train_windspeed)
xts_week_test_windspeed <- apply.weekly(xts_test_windspeed, sum)
ts_week_test_windspeed <- na.remove(ts_ts(xts_week_test_windspeed))
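The xts::apply.weekly() aggregation above can be illustrated on a toy series (assuming sum as the aggregator, as in the chunk):

```r
library(xts)

# Three weeks of a constant daily series; apply.weekly() collapses each
# calendar week with the supplied function -- here sum, as above.
d <- seq(as.Date("2017-01-01"), by = "day", length.out = 21)
x <- xts(rep(1, 21), order.by = d)
apply.weekly(x, sum)  # weekly totals; partial weeks at the edges are shorter
```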
#ts_week_test_windspeed <- as.ts(xts_week_test_windspeed)
ts_week_test_meantemp
## Time Series:
## Start = 2017
## End = 2017.30938349179
## Frequency = 54.9479867256584
## [1] 15.91304 122.41073 92.34209 103.12740 120.67435 117.87830 109.63056
## [8] 135.81840 139.83333 150.79487 140.80200 149.57842 188.10000 211.42460
## [15] 201.63632 208.74603 234.58056 32.00000
## attr(,"na.removed")
## [1] 2 3 4 5 6 7 9 10 11 12 13 14 16 17 18 19 20 21 23
## [20] 24 25 26 27 28 30 31 32 33 34 35 37 38 39 40 41 42 44 45
## [39] 46 47 48 49 51 52 53 54 55 56 58 59 60 61 62 63 65 66 67
## [58] 68 69 70 72 73 74 75 76 77 79 80 81 82 83 84 86 87 88 89
## [77] 90 91 93 94 95 96 97 98 100 101 102 103 104 105 107 108 109 110 111
## [96] 112
Let us also create static plots for each of them:
ts_plot(ts_week_train_windspeed)
ts_plot(xts_train_windspeed)
Now we create a union of both time series and store it.
#VAR_data <- ts.union(ts_train_meantemp, ts_train_windspeed)
VAR_data <- ts.union(ts_week_train_meantemp, ts_week_train_windspeed)
colnames(VAR_data) <- cbind("meantemp","wind_speed")
#v1 <- cbind(ts_week_train_meantemp, ts_week_train_windspeed)
#colnames(v1) <- cbind("meantemp","wind_speed")
#lagselect <- VARselect(v1, type = "both")
#lagselect$selection
VAR_data <- na.remove(VAR_data)
#tail(v1)
We look at the lag orders suggested by the different criteria for a VAR model.
lagselect <- VARselect(VAR_data, season=12, type = "both")
lagselect$selection
## AIC(n) HQ(n) SC(n) FPE(n)
## 10 6 1 10
lagselect$criteria
## 1 2 3 4 5 6
## AIC(n) 10.86018 10.81179 10.74853 10.72197 10.70787 10.63106
## HQ(n) 11.06185 11.04034 11.00397 11.00430 11.01708 10.96716
## SC(n) 11.35841 11.37644 11.37961 11.41948 11.47181 11.46143
## FPE(n) 52091.96344 49644.23391 46616.68238 45413.78546 44800.44647 41513.24904
## 7 8 9 10
## AIC(n) 10.61618 10.62058 10.63627 10.60996
## HQ(n) 10.97918 11.01047 11.05304 11.05362
## SC(n) 11.51298 11.58381 11.66592 11.70605
## FPE(n) 40929.31852 41143.70844 41833.75446 40791.90935
Now that we have merged the meantemp and wind_speed columns, we fit a VAR model with lag order 10.
VAR_est <- VAR(y = VAR_data, season=8, type="both", p=10)
VAR_est
##
## VAR Estimation Results:
## =======================
##
## Estimated coefficients for equation meantemp:
## =============================================
## Call:
## meantemp = meantemp.l1 + wind_speed.l1 + meantemp.l2 + wind_speed.l2 + meantemp.l3 + wind_speed.l3 + meantemp.l4 + wind_speed.l4 + meantemp.l5 + wind_speed.l5 + meantemp.l6 + wind_speed.l6 + meantemp.l7 + wind_speed.l7 + meantemp.l8 + wind_speed.l8 + meantemp.l9 + wind_speed.l9 + meantemp.l10 + wind_speed.l10 + const + trend + sd1 + sd2 + sd3 + sd4 + sd5 + sd6 + sd7
##
## meantemp.l1 wind_speed.l1 meantemp.l2 wind_speed.l2 meantemp.l3
## 0.689509677 -0.036305048 0.122053939 0.130967121 0.049179176
## wind_speed.l3 meantemp.l4 wind_speed.l4 meantemp.l5 wind_speed.l5
## 0.046573897 -0.002112818 0.110696976 0.238902095 -0.050100942
## meantemp.l6 wind_speed.l6 meantemp.l7 wind_speed.l7 meantemp.l8
## -0.146334371 0.144786326 -0.113654255 -0.002846842 -0.148801275
## wind_speed.l8 meantemp.l9 wind_speed.l9 meantemp.l10 wind_speed.l10
## 0.149206407 0.074555275 -0.030718020 -0.049045011 0.219351729
## const trend sd1 sd2 sd3
## 17.370795528 0.021320033 8.139625406 0.391065418 5.383036420
## sd4 sd5 sd6 sd7
## 1.677946665 10.140458250 0.772150182 0.358336901
##
##
## Estimated coefficients for equation wind_speed:
## ===============================================
## Call:
## wind_speed = meantemp.l1 + wind_speed.l1 + meantemp.l2 + wind_speed.l2 + meantemp.l3 + wind_speed.l3 + meantemp.l4 + wind_speed.l4 + meantemp.l5 + wind_speed.l5 + meantemp.l6 + wind_speed.l6 + meantemp.l7 + wind_speed.l7 + meantemp.l8 + wind_speed.l8 + meantemp.l9 + wind_speed.l9 + meantemp.l10 + wind_speed.l10 + const + trend + sd1 + sd2 + sd3 + sd4 + sd5 + sd6 + sd7
##
## meantemp.l1 wind_speed.l1 meantemp.l2 wind_speed.l2 meantemp.l3
## 0.069024744 0.184304838 0.131154305 -0.004551666 -0.013554779
## wind_speed.l3 meantemp.l4 wind_speed.l4 meantemp.l5 wind_speed.l5
## 0.128031917 -0.130028419 0.052890178 0.169230094 0.163426050
## meantemp.l6 wind_speed.l6 meantemp.l7 wind_speed.l7 meantemp.l8
## -0.075300255 0.027147295 -0.251302983 -0.079158189 0.040539603
## wind_speed.l8 meantemp.l9 wind_speed.l9 meantemp.l10 wind_speed.l10
## -0.037406248 0.047408589 0.141711053 -0.037567017 0.127941422
## const trend sd1 sd2 sd3
## 20.605673892 0.019403293 -7.616951554 -0.788899882 -7.446507564
## sd4 sd5 sd6 sd7
## 1.948843755 -0.822851441 -6.381095605 -8.345748018
summary(VAR_est)
##
## VAR Estimation Results:
## =========================
## Endogenous variables: meantemp, wind_speed
## Deterministic variables: both
## Sample size: 198
## Log Likelihood: -1549.651
## Roots of the characteristic polynomial:
## 0.9788 0.9788 0.9048 0.9048 0.877 0.877 0.8366 0.8366 0.8299 0.8299 0.8002 0.8002 0.7419 0.7419 0.6714 0.654 0.654 0.5891 0.5891 0.1924
## Call:
## VAR(y = VAR_data, p = 10, type = "both", season = 8L)
##
##
## Estimation results for equation meantemp:
## =========================================
## meantemp = meantemp.l1 + wind_speed.l1 + meantemp.l2 + wind_speed.l2 + meantemp.l3 + wind_speed.l3 + meantemp.l4 + wind_speed.l4 + meantemp.l5 + wind_speed.l5 + meantemp.l6 + wind_speed.l6 + meantemp.l7 + wind_speed.l7 + meantemp.l8 + wind_speed.l8 + meantemp.l9 + wind_speed.l9 + meantemp.l10 + wind_speed.l10 + const + trend + sd1 + sd2 + sd3 + sd4 + sd5 + sd6 + sd7
##
## Estimate Std. Error t value Pr(>|t|)
## meantemp.l1 0.689510 0.078189 8.818 1.38e-15 ***
## wind_speed.l1 -0.036305 0.067738 -0.536 0.592687
## meantemp.l2 0.122054 0.093322 1.308 0.192691
## wind_speed.l2 0.130967 0.066944 1.956 0.052068 .
## meantemp.l3 0.049179 0.093036 0.529 0.597774
## wind_speed.l3 0.046574 0.066730 0.698 0.486168
## meantemp.l4 -0.002113 0.092811 -0.023 0.981865
## wind_speed.l4 0.110697 0.066391 1.667 0.097299 .
## meantemp.l5 0.238902 0.092815 2.574 0.010912 *
## wind_speed.l5 -0.050101 0.066979 -0.748 0.455495
## meantemp.l6 -0.146334 0.091951 -1.591 0.113380
## wind_speed.l6 0.144786 0.066433 2.179 0.030683 *
## meantemp.l7 -0.113654 0.091656 -1.240 0.216691
## wind_speed.l7 -0.002847 0.066785 -0.043 0.966049
## meantemp.l8 -0.148801 0.092809 -1.603 0.110736
## wind_speed.l8 0.149206 0.066382 2.248 0.025890 *
## meantemp.l9 0.074555 0.091103 0.818 0.414301
## wind_speed.l9 -0.030718 0.066459 -0.462 0.644525
## meantemp.l10 -0.049045 0.072711 -0.675 0.500904
## wind_speed.l10 0.219352 0.067240 3.262 0.001337 **
## const 17.370796 4.941341 3.515 0.000564 ***
## trend 0.021320 0.016078 1.326 0.186623
## sd1 8.139625 3.773365 2.157 0.032409 *
## sd2 0.391065 3.684293 0.106 0.915594
## sd3 5.383036 3.758591 1.432 0.153935
## sd4 1.677947 3.559751 0.471 0.637987
## sd5 10.140458 3.762532 2.695 0.007747 **
## sd6 0.772150 3.686816 0.209 0.834361
## sd7 0.358337 3.714385 0.096 0.923259
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
##
## Residual standard error: 12.26 on 169 degrees of freedom
## Multiple R-Squared: 0.9453, Adjusted R-squared: 0.9362
## F-statistic: 104.2 on 28 and 169 DF, p-value: < 2.2e-16
##
##
## Estimation results for equation wind_speed:
## ===========================================
## wind_speed = meantemp.l1 + wind_speed.l1 + meantemp.l2 + wind_speed.l2 + meantemp.l3 + wind_speed.l3 + meantemp.l4 + wind_speed.l4 + meantemp.l5 + wind_speed.l5 + meantemp.l6 + wind_speed.l6 + meantemp.l7 + wind_speed.l7 + meantemp.l8 + wind_speed.l8 + meantemp.l9 + wind_speed.l9 + meantemp.l10 + wind_speed.l10 + const + trend + sd1 + sd2 + sd3 + sd4 + sd5 + sd6 + sd7
##
## Estimate Std. Error t value Pr(>|t|)
## meantemp.l1 0.069025 0.092992 0.742 0.458956
## wind_speed.l1 0.184305 0.080562 2.288 0.023391 *
## meantemp.l2 0.131154 0.110990 1.182 0.238993
## wind_speed.l2 -0.004552 0.079617 -0.057 0.954478
## meantemp.l3 -0.013555 0.110649 -0.123 0.902647
## wind_speed.l3 0.128032 0.079363 1.613 0.108556
## meantemp.l4 -0.130028 0.110382 -1.178 0.240458
## wind_speed.l4 0.052890 0.078960 0.670 0.503879
## meantemp.l5 0.169230 0.110387 1.533 0.127130
## wind_speed.l5 0.163426 0.079660 2.052 0.041756 *
## meantemp.l6 -0.075300 0.109359 -0.689 0.492044
## wind_speed.l6 0.027147 0.079010 0.344 0.731579
## meantemp.l7 -0.251303 0.109008 -2.305 0.022362 *
## wind_speed.l7 -0.079158 0.079429 -0.997 0.320386
## meantemp.l8 0.040540 0.110380 0.367 0.713874
## wind_speed.l8 -0.037406 0.078950 -0.474 0.636255
## meantemp.l9 0.047409 0.108350 0.438 0.662271
## wind_speed.l9 0.141711 0.079041 1.793 0.074780 .
## meantemp.l10 -0.037567 0.086477 -0.434 0.664540
## wind_speed.l10 0.127941 0.079970 1.600 0.111495
## const 20.605674 5.876827 3.506 0.000582 ***
## trend 0.019403 0.019122 1.015 0.311698
## sd1 -7.616952 4.487732 -1.697 0.091484 .
## sd2 -0.788900 4.381796 -0.180 0.857337
## sd3 -7.446508 4.470160 -1.666 0.097601 .
## sd4 1.948844 4.233676 0.460 0.645879
## sd5 -0.822851 4.474847 -0.184 0.854325
## sd6 -6.381096 4.384797 -1.455 0.147448
## sd7 -8.345748 4.417585 -1.889 0.060576 .
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
##
## Residual standard error: 14.58 on 169 degrees of freedom
## Multiple R-Squared: 0.5074, Adjusted R-squared: 0.4258
## F-statistic: 6.217 on 28 and 169 DF, p-value: 8.952e-15
##
##
##
## Covariance matrix of residuals:
## meantemp wind_speed
## meantemp 150.37 49.25
## wind_speed 49.25 212.69
##
## Correlation matrix of residuals:
## meantemp wind_speed
## meantemp 1.0000 0.2754
## wind_speed 0.2754 1.0000
summary(VAR_est$varresult)
## Length Class Mode
## meantemp 12 lm list
## wind_speed 12 lm list
Based on the lag-selection output, these are the criterion values at lag 10:
lagselect$criteria[,10]
## AIC(n) HQ(n) SC(n) FPE(n)
## 10.60996 11.05362 11.70605 40791.90935
The adjusted \(R^2\) of the fitted model can also be reported for both of the series:
summary(VAR_est$varresult$meantemp)$adj.r.squared
## [1] 0.9361864
summary(VAR_est$varresult$wind_speed)$adj.r.squared
## [1] 0.4258064
We test that the residuals are uncorrelated using a Portmanteau test.
serial.test(VAR_est, lags.pt=10, type="PT.asymptotic")
##
## Portmanteau Test (asymptotic)
##
## data: Residuals of VAR object VAR_est
## Chi-squared = 7.3153, df = 0, p-value < 2.2e-16
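Note the reported df = 0: for a K-variable VAR(p), the asymptotic Portmanteau statistic has K^2 (lags.pt - p) degrees of freedom, so setting lags.pt equal to the fitted order p = 10 makes the test degenerate and its p-value uninformative. A sketch on simulated data (assuming the vars package), with lags.pt set well above p:

```r
library(vars)

set.seed(1)
# Two-variable white-noise system fitted as a VAR(2); with lags.pt = 12
# the test has K^2 * (lags.pt - p) = 4 * 10 = 40 degrees of freedom.
sim <- data.frame(a = rnorm(200), b = rnorm(200))
fit <- VAR(sim, p = 2, type = "const")
serial.test(fit, lags.pt = 12, type = "PT.asymptotic")
```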
forecasts <- predict(VAR_est, h=114)
forecast <- VAR_est |> forecast(h=18)
Now we use the test data to evaluate the predictions:
predicted_meantemp <- as.numeric(forecast[2]$forecast$meantemp$mean)
actual_meantemp <- as.numeric(ts_week_test_meantemp)
predicted_windspeed <- as.numeric(forecast[2]$forecast$wind_speed$mean)
actual_windspeed <- as.numeric(ts_week_test_windspeed)
rmse(predicted_meantemp, actual_meantemp)
## [1] 73.53212
rmse(predicted_windspeed, actual_windspeed)
## [1] 19.90842
mae(predicted_meantemp, actual_meantemp)
## [1] 57.97236
mae(predicted_windspeed, actual_windspeed)
## [1] 14.47499
rsq <- function(x, y) cor(x, y)^2
rsq(predicted_meantemp, actual_meantemp)
## [1] 0.3756512
rsq(predicted_windspeed, actual_windspeed)
## [1] 0.1374916
ts_plot(forecast[2]$forecast$meantemp$mean, ts_week_test_meantemp)
ts_plot(forecast[2]$forecast$wind_speed$mean, ts_week_test_windspeed)
plot(forecasts)
forecast[2]$forecast$meantemp |> autoplot() + autolayer(ts_week_test_meantemp)
forecast[2]$forecast$wind_speed |> autoplot() + autolayer(ts_week_test_windspeed)
#VAR_EQ1 <- dynlm(ts_train_meantemp ~ L(ts_train_meantemp, 1:2) + L(ts_train_windspeed, 1:2),
#                 start = decimal_date(ymd("2013-01-01")), end = decimal_date(ymd("2017-01-01")))
#VAR_EQ2 <- dynlm(ts_train_windspeed ~ L(ts_train_meantemp, 1:2) + L(ts_train_windspeed, 1:2),
#                 start = decimal_date(ymd("2013-01-01")), end = decimal_date(ymd("2017-01-01")))
Granger_meantemp <- causality(VAR_est, cause = "meantemp")
Granger_meantemp
## $Granger
##
## Granger causality H0: meantemp do not Granger-cause wind_speed
##
## data: VAR object VAR_est
## F-Test = 3.0682, df1 = 10, df2 = 338, p-value = 0.0009546
##
##
## $Instant
##
## H0: No instantaneous causality between: meantemp and wind_speed
##
## data: VAR object VAR_est
## Chi-squared = 13.96, df = 1, p-value = 0.0001867
Granger_windspeed <- causality(VAR_est, cause = "wind_speed")
Granger_windspeed
## $Granger
##
## Granger causality H0: wind_speed do not Granger-cause meantemp
##
## data: VAR object VAR_est
## F-Test = 2.5126, df1 = 10, df2 = 338, p-value = 0.006333
##
##
## $Instant
##
## H0: No instantaneous causality between: wind_speed and meantemp
##
## data: VAR object VAR_est
## Chi-squared = 13.96, df = 1, p-value = 0.0001867
FEVD1 <- fevd(VAR_est, n.ahead = 50)
FEVD1
## $meantemp
## meantemp wind_speed
## [1,] 1.0000000 0.000000000
## [2,] 0.9988206 0.001179378
## [3,] 0.9921833 0.007816717
## [4,] 0.9830339 0.016966062
## [5,] 0.9614487 0.038551268
## [6,] 0.9588092 0.041190775
## [7,] 0.9377211 0.062278922
## [8,] 0.9134910 0.086509021
## [9,] 0.8625721 0.137427893
## [10,] 0.8254314 0.174568631
## [11,] 0.7681499 0.231850119
## [12,] 0.7173018 0.282698208
## [13,] 0.6756769 0.324323057
## [14,] 0.6306769 0.369323114
## [15,] 0.5968982 0.403101815
## [16,] 0.5643771 0.435622867
## [17,] 0.5380624 0.461937591
## [18,] 0.5178299 0.482170088
## [19,] 0.5033850 0.496614955
## [20,] 0.4933082 0.506691830
## [21,] 0.4867732 0.513226817
## [22,] 0.4843880 0.515611965
## [23,] 0.4855313 0.514468711
## [24,] 0.4899476 0.510052413
## [25,] 0.4961818 0.503818245
## [26,] 0.5030985 0.496901511
## [27,] 0.5102635 0.489736522
## [28,] 0.5170835 0.482916510
## [29,] 0.5228099 0.477190125
## [30,] 0.5271072 0.472892785
## [31,] 0.5294044 0.470595590
## [32,] 0.5297320 0.470268018
## [33,] 0.5281879 0.471812079
## [34,] 0.5249426 0.475057380
## [35,] 0.5204132 0.479586819
## [36,] 0.5149487 0.485051263
## [37,] 0.5088834 0.491116581
## [38,] 0.5025700 0.497429955
## [39,] 0.4964771 0.503522861
## [40,] 0.4909221 0.509077920
## [41,] 0.4861755 0.513824529
## [42,] 0.4823865 0.517613461
## [43,] 0.4796081 0.520391866
## [44,] 0.4779120 0.522088020
## [45,] 0.4772869 0.522713081
## [46,] 0.4776174 0.522382632
## [47,] 0.4787252 0.521274770
## [48,] 0.4804009 0.519599086
## [49,] 0.4824317 0.517568315
## [50,] 0.4846137 0.515386330
##
## $wind_speed
## meantemp wind_speed
## [1,] 0.07585475 0.9241452
## [2,] 0.08405805 0.9159420
## [3,] 0.10823066 0.8917693
## [4,] 0.12850993 0.8714901
## [5,] 0.12843197 0.8715680
## [6,] 0.16336265 0.8366374
## [7,] 0.17888075 0.8211192
## [8,] 0.18094166 0.8190583
## [9,] 0.18051164 0.8194884
## [10,] 0.17419141 0.8258086
## [11,] 0.16695287 0.8330471
## [12,] 0.16518175 0.8348183
## [13,] 0.16507974 0.8349203
## [14,] 0.16645482 0.8335452
## [15,] 0.16453433 0.8354657
## [16,] 0.16225321 0.8377468
## [17,] 0.16407736 0.8359226
## [18,] 0.17090974 0.8290903
## [19,] 0.17604569 0.8239543
## [20,] 0.17960182 0.8203982
## [21,] 0.18265705 0.8173429
## [22,] 0.18690905 0.8130910
## [23,] 0.19438529 0.8056147
## [24,] 0.20045512 0.7995449
## [25,] 0.20333657 0.7966634
## [26,] 0.20576390 0.7942361
## [27,] 0.20819005 0.7918100
## [28,] 0.21018765 0.7898124
## [29,] 0.21158046 0.7884195
## [30,] 0.21171277 0.7882872
## [31,] 0.21107900 0.7889210
## [32,] 0.21025594 0.7897441
## [33,] 0.20925443 0.7907456
## [34,] 0.20816943 0.7918306
## [35,] 0.20728517 0.7927148
## [36,] 0.20647236 0.7935276
## [37,] 0.20573784 0.7942622
## [38,] 0.20529778 0.7947022
## [39,] 0.20534235 0.7946577
## [40,] 0.20599918 0.7940008
## [41,] 0.20698159 0.7930184
## [42,] 0.20802076 0.7919792
## [43,] 0.20925384 0.7907462
## [44,] 0.21078009 0.7892199
## [45,] 0.21247569 0.7875243
## [46,] 0.21413925 0.7858607
## [47,] 0.21558787 0.7844121
## [48,] 0.21681327 0.7831867
## [49,] 0.21789182 0.7821082
## [50,] 0.21877383 0.7812262
plot(FEVD1)
set.seed(34)
# nnetar() requires a numeric vector or time series object as input;
# see ?nnetar for more info on the function.
# By default, nnetar() fits multiple neural networks and averages their
# results; the xreg option accepts only numeric vectors.
fit_windspeed <- nnetar(ts_train_windspeed)
fit_windspeed
## Series: ts_train_windspeed
## Model: NNAR(1,1,2)[365]
## Call: nnetar(y = ts_train_windspeed)
##
## Average of 20 networks, each of which is
## a 2-2-1 network with 9 weights
## options were - linear output units
##
## sigma^2 estimated as 14.78
forecast_windspeed <- forecast(fit_windspeed, h = 114, PI = T)
forecast_windspeed
## Point Forecast Lo 80 Hi 80 Lo 95 Hi 95
## 2017.0028 2.695754 -2.4417847 7.661440 -4.9298169 9.836679
## 2017.0056 4.108027 -1.2295638 9.744961 -4.2790012 12.524519
## 2017.0083 4.972615 -0.7283696 10.417269 -3.6958952 13.323979
## 2017.0110 5.559379 -0.3283564 11.441156 -3.6630063 15.064046
## 2017.0138 5.870666 0.1855882 11.967556 -3.4003592 14.908291
## 2017.0165 6.929822 0.7995860 11.762953 -2.7631734 14.902749
## 2017.0192 7.472025 0.7913296 12.416056 -2.4445398 15.432587
## 2017.0220 6.985192 0.8431421 12.394983 -2.4940523 15.925438
## 2017.0247 6.707453 0.5845201 11.669544 -2.1497730 14.356694
## 2017.0275 7.308517 1.0242328 12.603940 -2.3634123 15.470643
## 2017.0302 7.336384 0.5678970 12.661313 -2.7806676 15.351510
## 2017.0329 7.196923 0.8941281 12.792767 -2.4727908 16.213221
## 2017.0357 7.639497 0.8850727 12.800908 -1.8298810 15.911132
## 2017.0384 7.854527 1.6405077 12.970126 -1.4299875 15.923839
## 2017.0412 7.155729 1.5410958 12.678056 -2.2091816 15.360632
## 2017.0439 6.794908 0.9308665 12.648085 -1.7532790 15.645562
## 2017.0466 6.929512 1.1510558 12.373679 -2.5604730 15.175219
## 2017.0494 7.427624 0.8431290 12.521468 -2.1842559 15.737880
## 2017.0521 7.699247 1.4696978 12.943855 -2.0638289 15.555218
## 2017.0548 7.348554 1.3325455 12.853673 -2.2721439 16.029127
## 2017.0576 7.003019 0.4646883 12.617498 -2.7207436 16.112501
## 2017.0603 7.473798 0.5854836 12.556436 -2.0100389 15.819167
## 2017.0631 7.296383 0.6118948 12.352426 -2.6209937 15.686301
## 2017.0658 6.880643 0.5660392 12.288874 -2.4170166 14.802312
## 2017.0685 6.909790 0.6650184 12.368304 -2.8729290 15.331345
## 2017.0713 7.042321 0.7081228 12.578954 -3.2506112 15.544328
## 2017.0740 6.748841 0.7118609 12.497913 -2.5480819 15.230259
## 2017.0767 6.577807 0.4814809 11.863290 -2.8335175 15.174263
## 2017.0795 7.288084 0.6724498 12.653592 -2.1177767 15.590773
## 2017.0822 7.623213 0.7202819 12.920043 -2.4235426 15.340806
## 2017.0850 7.806842 1.1369511 12.975684 -2.2165655 15.389592
## 2017.0877 7.876148 1.4134383 13.172961 -1.5788009 16.295697
## 2017.0904 7.900631 1.5934445 12.945334 -2.1920717 16.315826
## 2017.0932 7.181100 1.2195183 12.374415 -1.9640630 14.766000
## 2017.0959 7.042827 0.9345506 12.584050 -2.1256923 15.447777
## 2017.0986 6.754483 1.0446665 12.568535 -2.2288296 15.146524
## 2017.1014 7.323631 1.0553711 12.437084 -2.2434565 15.292713
## 2017.1041 7.613889 1.2037130 12.682406 -2.0182783 16.000992
## 2017.1069 7.033971 0.8361356 12.019635 -2.0108049 15.615054
## 2017.1096 7.475154 0.9730575 12.623104 -2.8192723 15.431037
## 2017.1123 6.953682 0.4239370 12.469425 -2.1221589 15.517499
## 2017.1151 7.422985 0.4593655 12.677435 -2.7553510 16.161747
## 2017.1178 7.699793 0.9611020 12.857462 -2.3085432 15.577129
## 2017.1206 7.800725 0.6214774 13.154049 -2.0551500 16.259838
## 2017.1233 7.838008 0.9542557 12.736510 -2.2807104 16.048319
## 2017.1260 7.428812 1.0067764 12.350592 -2.3754303 15.921744
## 2017.1288 7.503051 0.7141450 13.152483 -2.4273474 16.147668
## 2017.1315 7.275755 1.0741216 12.624649 -2.2417979 16.127201
## 2017.1342 7.682770 1.2435069 12.725961 -1.5678263 15.906526
## 2017.1370 7.995231 2.0511646 12.938161 -1.0843110 16.448213
## 2017.1397 8.102810 2.2786166 12.909551 -1.2939470 16.292255
## 2017.1425 8.046704 1.9888605 13.286666 -1.4524631 16.315915
## 2017.1452 7.289188 1.4624368 12.663907 -1.4501272 15.552601
## 2017.1479 7.491440 0.9858192 12.536912 -1.7671018 15.715953
## 2017.1507 6.979728 0.4224455 12.380468 -3.1645460 16.007820
## 2017.1534 6.756069 0.7280368 12.550881 -2.6571944 15.517560
## 2017.1561 6.684402 0.8774238 12.440966 -2.1768597 15.679786
## 2017.1589 6.630177 0.7104965 12.681141 -2.9186416 15.382608
## 2017.1616 6.518914 0.3009720 12.293955 -3.4167485 15.619091
## 2017.1644 6.458554 0.4618702 11.886139 -2.4410793 14.579348
## 2017.1671 6.664711 0.6581474 12.214668 -2.0040747 15.465415
## 2017.1698 7.291801 0.7841005 12.608221 -2.5528433 15.267596
## 2017.1726 7.687993 0.8079273 12.957027 -2.3138259 16.163571
## 2017.1753 7.743951 0.5952887 12.951019 -3.0551785 15.961514
## 2017.1780 7.764404 1.0104846 12.783052 -1.4913199 15.374850
## 2017.1808 7.844247 0.8684740 12.703256 -2.1093547 16.021531
## 2017.1835 7.918418 1.4343036 12.878412 -1.6661857 15.638995
## 2017.1863 7.934022 1.6269026 13.222783 -1.5622707 16.159223
## 2017.1890 7.923432 1.6173353 13.342042 -2.5203847 16.583299
## 2017.1917 7.957489 1.5126900 13.673262 -2.1906285 16.829899
## 2017.1945 7.938535 1.4348877 13.441025 -2.0244963 16.336715
## 2017.1972 7.921969 1.7248137 13.226961 -1.4866569 16.672133
## 2017.1999 7.944581 1.3812174 13.078060 -2.1013405 16.087399
## 2017.2027 7.990451 1.7901030 13.299126 -1.7158477 15.714896
## 2017.2054 7.954410 1.6233402 13.271445 -1.4596821 16.421635
## 2017.2082 7.935402 1.1941104 13.027038 -1.6013327 16.306319
## 2017.2109 7.473394 1.2876110 12.870728 -1.7909829 16.236356
## 2017.2136 6.993514 1.0703251 12.403250 -2.0692642 15.372052
## 2017.2164 7.573101 1.8487315 12.627779 -1.3622859 15.956636
## 2017.2191 8.009835 1.8237770 12.826231 -0.9953829 16.358215
## 2017.2219 8.016850 1.8529601 13.080406 -1.4976567 16.373759
## 2017.2246 7.492908 0.9595258 12.741604 -1.8713115 15.456522
## 2017.2273 7.178484 1.2859770 12.368695 -2.2340430 14.926842
## 2017.2301 7.605809 1.6014202 12.705382 -1.7714555 15.618277
## 2017.2328 7.802701 1.5872852 13.547518 -1.3407325 16.573391
## 2017.2355 7.860022 1.3769302 13.206567 -2.8171127 15.761119
## 2017.2383 7.897365 1.5231585 13.296777 -2.2143711 15.963951
## 2017.2410 7.911841 1.0851192 13.122118 -2.2350915 16.233997
## 2017.2438 7.510745 0.8731561 12.920749 -2.8962622 16.338125
## 2017.2465 7.810005 1.7071578 12.913370 -1.2957235 15.854324
## 2017.2492 7.843728 1.4070556 13.460894 -2.0208755 16.052964
## 2017.2520 7.178274 1.1977695 13.341320 -2.2613616 15.882117
## 2017.2547 7.582406 1.0731673 12.855431 -1.7042210 15.594721
## 2017.2574 7.793469 1.6820351 13.265570 -2.2417448 15.894497
## 2017.2602 7.391535 1.2071083 13.035364 -1.4398555 15.854049
## 2017.2629 7.731653 1.6563569 13.396911 -1.0918029 15.921903
## 2017.2657 7.483562 1.3109656 13.266748 -2.0963349 16.014070
## 2017.2684 7.809046 1.6233894 12.895092 -0.8109568 15.471590
## 2017.2711 7.823045 1.1873853 12.908741 -2.1912479 15.791521
## 2017.2739 8.039983 1.5272359 13.161516 -1.4079617 16.548171
## 2017.2766 8.210292 1.9358202 12.978454 -0.5688385 15.548255
## 2017.2793 8.277965 2.0589907 13.293331 -1.2105057 16.306115
## 2017.2821 8.139266 2.3444638 13.174449 -0.9132949 16.244072
## 2017.2848 8.064705 1.8170145 13.106296 -0.9338377 16.123462
## 2017.2876 8.050809 2.0366125 13.354405 -0.6184289 15.982777
## 2017.2903 8.016150 2.0043561 12.854756 -1.4228079 16.435408
## 2017.2930 7.970796 1.7489570 12.917603 -1.4284942 15.589856
## 2017.2958 7.606301 1.4294389 12.681653 -2.2631179 15.257186
## 2017.2985 7.895260 1.7814910 12.933244 -1.3689412 16.715634
## 2017.3013 8.067839 1.7854803 13.047691 -1.2714116 16.031519
## 2017.3040 8.009974 2.1212214 13.219534 -0.9885145 16.398142
## 2017.3067 8.200472 2.0783011 13.137535 -1.0644745 16.112288
## 2017.3095 8.099003 1.8527605 12.842223 -0.8119561 16.121128
## 2017.3122 8.007971 2.1836598 13.140810 -1.2820778 16.250347
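The intervals produced with PI = TRUE are not analytic: for nnetar objects, forecast() builds them by simulating many future sample paths from the fitted network and taking empirical quantiles at each horizon. The same idea, sketched for a toy AR(1) process in base R:

```r
# Simulation-based prediction intervals for a toy AR(1):
# simulate many future paths, then take empirical quantiles per horizon.
set.seed(34)
phi <- 0.7; sigma <- 1; y_last <- 5; h <- 10; nsim <- 1000
paths <- replicate(nsim, {
  y <- numeric(h)
  prev <- y_last
  for (i in seq_len(h)) {
    prev <- phi * prev + rnorm(1, sd = sigma)
    y[i] <- prev
  }
  y
})                                     # h x nsim matrix of simulated futures
lo80 <- apply(paths, 1, quantile, probs = 0.10)
hi80 <- apply(paths, 1, quantile, probs = 0.90)
```

This also explains why PI = TRUE is noticeably slower than point forecasting alone.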
fit_meantemp <- nnetar(ts_train_meantemp)
fit_meantemp
## Series: ts_train_meantemp
## Model: NNAR(11,1,6)[365]
## Call: nnetar(y = ts_train_meantemp)
##
## Average of 20 networks, each of which is
## a 12-6-1 network with 85 weights
## options were - linear output units
##
## sigma^2 estimated as 1.884
forecast_meantemp <- forecast(fit_meantemp, h = 114, PI = TRUE)
forecast_meantemp
## Point Forecast Lo 80 Hi 80 Lo 95 Hi 95
## 2017.0028 14.40261 12.67282 16.22559 11.628219 17.32989
## 2017.0056 14.76241 12.59151 17.21406 11.397691 18.75455
## 2017.0083 14.74025 12.30468 17.80307 10.934184 19.38441
## 2017.0110 14.86117 12.22824 17.85583 11.055098 19.67392
## 2017.0138 14.78790 12.00794 18.08210 10.628404 19.60861
## 2017.0165 14.75902 12.12591 18.16886 10.590295 19.70237
## 2017.0192 14.54362 11.83540 18.41611 10.092681 19.77308
## 2017.0220 14.54062 11.56089 18.48419 9.958638 20.39465
## 2017.0247 14.59976 11.59163 18.67660 10.027744 20.51988
## 2017.0275 14.88214 11.91950 19.24612 10.467832 21.14797
## 2017.0302 15.13408 12.25430 19.52811 10.788213 21.49064
## 2017.0329 15.18030 12.33597 19.53514 10.704289 21.39711
## 2017.0357 14.86967 12.12913 19.27309 10.334098 21.16061
## 2017.0384 14.65818 11.94595 18.96512 10.062904 21.02950
## 2017.0412 14.49290 11.92215 19.02006 9.826292 20.73199
## 2017.0439 14.30575 11.81606 18.90817 9.843977 20.71919
## 2017.0466 14.01852 11.43643 18.63737 9.631121 20.63341
## 2017.0494 13.79330 11.46816 18.53373 9.675019 20.57327
## 2017.0521 13.57984 11.46162 18.61402 9.833542 20.45278
## 2017.0548 13.34454 11.39687 18.09339 9.776658 20.25331
## 2017.0576 13.31527 11.38643 18.31615 9.742975 19.92353
## 2017.0603 13.02799 11.10796 18.00865 9.534888 19.93417
## 2017.0631 12.77280 10.85837 18.03444 9.515180 19.86498
## 2017.0658 12.81157 11.05112 18.15315 9.450642 19.92513
## 2017.0685 13.24655 11.58390 18.47801 10.185311 20.53420
## 2017.0713 13.55447 11.76358 18.54141 10.431893 20.61397
## 2017.0740 14.16696 12.27245 19.09713 10.694929 20.99242
## 2017.0767 14.77410 12.78033 19.38110 11.004748 21.21839
## 2017.0795 14.98511 12.64128 19.61070 10.739518 21.53050
## 2017.0822 14.95012 12.53490 19.25849 10.820177 21.48438
## 2017.0850 14.94488 12.33675 19.32406 10.790565 21.22853
## 2017.0877 14.80391 12.02130 19.10446 10.560579 21.20121
## 2017.0904 14.98744 12.18445 19.30450 10.414015 21.22510
## 2017.0932 14.93798 11.96237 19.12431 10.628752 21.05182
## 2017.0959 15.12729 12.41351 19.29108 10.879389 21.32949
## 2017.0986 15.29162 12.28513 19.41999 10.760534 21.18127
## 2017.1014 15.57476 12.59266 19.62786 10.940590 21.52252
## 2017.1041 15.75122 12.44737 19.70242 10.947575 21.98885
## 2017.1069 15.95475 12.65734 19.95630 11.002571 22.09880
## 2017.1096 16.21872 12.97329 20.48833 11.410082 22.46924
## 2017.1123 16.47921 13.25118 20.56600 11.663333 22.65181
## 2017.1151 16.49584 13.15274 20.33003 11.570385 22.70866
## 2017.1178 16.59599 12.92263 20.52337 11.480665 22.77834
## 2017.1206 16.69998 13.17708 20.64427 11.357418 22.80492
## 2017.1233 16.88312 13.30367 20.55877 11.731665 23.06946
## 2017.1260 17.08633 13.48517 20.78597 11.784190 23.16774
## 2017.1288 17.37409 13.75044 21.26351 12.050884 23.62206
## 2017.1315 17.95159 14.33237 21.74896 12.779050 24.31907
## 2017.1342 18.52350 14.83863 22.48842 13.203691 24.73845
## 2017.1370 18.75079 14.90727 22.31200 12.907013 24.94238
## 2017.1397 18.95458 15.11039 22.76713 13.017044 24.43318
## 2017.1425 19.21754 15.19493 23.15059 13.221398 25.06946
## 2017.1452 19.52072 15.28448 23.44169 13.223371 25.52832
## 2017.1479 19.90508 15.58456 23.60652 13.430859 25.50789
## 2017.1507 20.32247 15.55019 23.83666 14.190290 26.01082
## 2017.1534 20.73643 15.91428 24.35484 13.878610 26.31850
## 2017.1561 21.15139 16.28011 24.51647 14.331323 26.63793
## 2017.1589 21.55751 16.33438 24.78711 14.447072 26.95203
## 2017.1616 22.01488 16.99167 25.14743 14.641323 27.39066
## 2017.1644 22.48887 17.19875 25.41234 15.305786 27.71382
## 2017.1671 23.00445 17.64124 25.82184 15.666180 27.90849
## 2017.1698 23.56317 18.17433 26.14321 16.363714 28.56896
## 2017.1726 23.92109 18.30926 26.48069 16.139619 28.20389
## 2017.1753 24.19324 18.25804 26.43096 16.212182 28.13583
## 2017.1780 24.57094 18.37500 26.73846 16.134900 28.43976
## 2017.1808 24.94124 18.67931 26.85689 16.292275 28.35964
## 2017.1835 25.32858 18.75822 27.06049 16.741776 28.81428
## 2017.1863 25.77044 19.15482 27.16041 16.961971 29.16146
## 2017.1890 26.11331 19.43529 27.26610 17.113166 29.14834
## 2017.1917 26.30230 19.60362 27.63313 17.629353 29.58310
## 2017.1945 26.38324 19.56939 27.84248 17.315040 29.50278
## 2017.1972 26.27081 19.64886 27.85909 17.522646 29.60219
## 2017.1999 26.29398 19.83994 27.95816 17.245661 29.54453
## 2017.2027 26.31573 19.99605 28.08901 17.638956 29.69721
## 2017.2054 26.56394 20.37696 28.44289 18.163556 29.99422
## 2017.2082 26.73101 20.47046 28.68163 18.336181 30.68183
## 2017.2109 26.88068 20.91773 28.91552 18.774254 30.76906
## 2017.2136 26.99252 21.49126 29.02492 19.326982 31.07841
## 2017.2164 27.02817 21.68721 29.36545 19.342604 31.06744
## 2017.2191 26.86131 21.63804 29.56030 19.876803 31.26494
## 2017.2219 26.83255 21.91002 29.64306 19.698026 31.36219
## 2017.2246 27.01885 22.35700 29.70712 19.768005 31.63541
## 2017.2273 27.42045 22.71184 30.20386 20.449490 31.53277
## 2017.2301 27.20077 22.71593 29.95251 20.714522 31.63603
## 2017.2328 27.08784 22.92085 29.91181 20.902035 31.54458
## 2017.2355 26.97187 22.80657 30.22567 21.025138 31.67906
## 2017.2383 27.04197 23.05418 30.20233 20.733862 32.04850
## 2017.2410 27.38614 23.33082 30.80818 21.099323 32.49956
## 2017.2438 27.70772 23.94771 30.82058 21.705815 32.72416
## 2017.2465 27.91390 24.29036 31.06620 21.961657 32.75434
## 2017.2492 28.31836 24.61059 31.16088 22.805851 33.15013
## 2017.2520 28.71487 25.03504 31.67153 23.247816 33.22239
## 2017.2547 28.98276 25.21270 31.62695 23.264059 33.41548
## 2017.2574 29.18605 25.40416 32.05232 23.660160 33.60069
## 2017.2602 29.30443 25.43630 32.30317 23.586945 33.73245
## 2017.2629 29.34430 25.67540 32.08132 23.742480 33.68013
## 2017.2657 29.39678 25.74745 32.17360 23.852989 33.38321
## 2017.2684 29.58463 26.03422 32.36572 24.334863 33.66769
## 2017.2711 29.84763 26.23320 32.43310 24.692280 34.11857
## 2017.2739 29.83789 26.19317 32.35816 24.598184 34.23307
## 2017.2766 29.82691 26.31067 32.42723 24.624137 34.21494
## 2017.2793 29.91412 26.55706 32.53559 24.767040 34.17662
## 2017.2821 30.10860 26.71260 32.87281 25.205454 34.52378
## 2017.2848 30.32781 26.95699 33.09301 25.305162 34.63813
## 2017.2876 30.64443 27.08691 33.40322 25.669354 34.88612
## 2017.2903 30.78645 27.38368 33.30445 25.843993 35.02817
## 2017.2930 30.89339 27.50662 33.54483 26.114928 35.13933
## 2017.2958 30.94139 27.82334 33.83185 26.187123 35.45357
## 2017.2985 31.01328 27.69836 33.85723 25.939577 35.76170
## 2017.3013 31.09461 27.70791 33.91793 26.343458 35.54506
## 2017.3040 31.12428 27.84847 33.88121 26.221704 35.72485
## 2017.3067 31.03710 27.91398 34.00426 26.380597 35.53111
## 2017.3095 30.91446 27.92555 34.02706 26.342614 35.70284
## 2017.3122 30.82284 27.77221 34.07422 26.266969 35.42013
predicted <- as.numeric(forecast_meantemp$mean)
actual <- as.numeric(ts_test_meantemp)
rmse(predicted, actual)
## [1] 3.134642
mae(predicted, actual)
## [1] 2.456404
rsq <- function(x, y) cor(x, y)^2
rsq(actual, predicted)
## [1] 0.7723737
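For reference, rmse() and mae() are assumed here to come from the Metrics package; their base-R definitions, checked on a toy example:

```r
# Base-R equivalents of the accuracy metrics used above.
rmse_base <- function(pred, obs) sqrt(mean((pred - obs)^2))
mae_base  <- function(pred, obs) mean(abs(pred - obs))
rsq_base  <- function(x, y) cor(x, y)^2   # squared Pearson correlation

pred <- c(14.1, 15.0, 16.2)
obs  <- c(14.4, 14.8, 16.9)
rmse_base(pred, obs)
mae_base(pred, obs)
rsq_base(obs, pred)
```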
predicted <- as.numeric(forecast_windspeed$mean)
actual <- as.numeric(ts_test_windspeed)
rmse(predicted, actual)
## [1] 3.544034
mae(predicted, actual)
## [1] 2.761422
rsq(actual, predicted)
## [1] 0.05910477
Finally, I plot both forecasts against the held-out test data. The mean temperature forecast tracks the test series closely (R² ≈ 0.77), while the wind speed forecast explains little of the test-set variation (R² ≈ 0.06), consistent with wind speed being the much noisier series.
forecast_windspeed |> autoplot() + autolayer(ts_test_windspeed)
forecast_meantemp |> autoplot() + autolayer(ts_test_meantemp)